
About


CLIPE workshop


23 April 2024, Limassol (Cyprus), co-located with Eurographics 2024


Advances in technology are making VR/AR worlds a daily experience. Virtual characters are an important component of these worlds, but bringing them to life and giving them interaction and communication abilities requires highly specialized programming combined with artistic skills, as well as considerable investment: the millions spent on coders and designers to develop video games are a typical example. The objective of CLIPE is to design the next generation of VR-ready characters. This workshop is part of an MSCA-ITN project, also called CLIPE, that started in 2020 and has been addressing the most important current aspects of the problem, making characters capable of:

  • Behaving more naturally;

  • Interacting with real users sharing a virtual experience with them;

  • Being more intuitively and extensively controllable for virtual worlds designers.

Meeting these goals requires multidisciplinary research on VR/AR, computer graphics, computer animation, psychology and perception. This workshop will present the results of the CLIPE project, but is also open to submissions from other researchers.


This workshop will be co-located with Eurographics 2024 and invites researchers to submit original research papers related to the modeling and animation of humans and/or crowds. The workshop welcomes both technical papers and position papers, and authors may also submit work in progress. Relevant topics include, but are not limited to:

  • Modeling and animation of humanoid characters

  • Authoring virtual human behavior

  • Expressive character animation (e.g. face, gaze, gestures, ...)

  • Capture of human appearance and movement

  • Perceptual studies to evaluate character simulation quality and communication abilities

  • ML and deep-learning algorithms for character animation

  • Avatars in VR/AR and embodiment

  • Social interaction and groups of virtual humans

  • Navigation for autonomous characters

  • Crowd simulation

  • Motion capture to facilitate character animation

  • Computer vision methods for the reconstruction of shape, geometry or animation of humans




Important dates:

  • Submission deadline (extended): 16 February 2024 (Anywhere on Earth)

  • Notification of acceptance: 8 March 2024

  • Camera-ready deadline: 29 March 2024

  • Workshop date: 23 April 2024


You can download the LaTeX package from:


Yiorgos Chrysanthou
Research Director of the CYENS Centre of Excellence
Professor at the University of Cyprus


Nuria Pelechano
Associate Professor
Universitat Politècnica de Catalunya

Julien Pettré
Inria Research Scientist


Eduardo Alvarado, Ecole Polytechnique   

Nefeli Andreou, University of Cyprus

Klara Brandstaetter, University College London

Kiran Chhatre, KTH Royal Institute of Technology

Radek Daněček, Max Planck Institute for Intelligent Systems

Lisa Izzouzi, University College London

Ariel Kwiatkowski, Ecole Polytechnique

Marilena Lemonari, University of Cyprus

Mirela Ostrek, Max Planck Institute for Intelligent Systems

Jose Luis Ponton, Universitat Politècnica de Catalunya

Bharat Vyas, Trinity College Dublin

Yuliang Xiu, Max Planck Institute for Intelligent Systems

Tairan Yin, Inria

Haoran Yun, Universitat Politècnica de Catalunya

PROGRAM Tuesday 23 April

Session 1 – 9:00-10:30 “Character animation and simulation for VR – CLIPE results 1”

9:00 Tairan Yin – “The One-Man-Crowd: Towards Single-User Capture of Collective Motions using Virtual Reality”

9:15 Haoran Yun – “Real-time Avatar Animation Synthesis in Virtual Reality”

9:30 Lisa Izzouzi – “Social Evaluation”

9:45 Klara Brandstaetter – “Interaction by demonstration”

10:00 Eduardo Alvarado – “Efficient Models for Human Locomotion and Interaction in Natural Environments”

10:15 Nefeli Andreou – “Multimodal Generation of Realistic Human Bodies”


Session 2 – 11:00 – 12:30 “Mocap and Authoring virtual humans – submitted work”

11:00 Jean-Benoit Culié, Bojan Mijatovic, David Panzoli, Davud Nesimovic, Stéphane Sanchez, and Selma Rizvic – “A CRITS foray into cultural heritage: background characters for the SHELeadersVR project”

11:15 Panayiotis Kyriakou, Marios Kyriakou, and Yiorgos Chrysanthou – “Overcoming Challenges of Cycling Motion Capturing and Building a Comprehensive Dataset”

11:30 Eduardo Parrilla, Alfredo Ballester, Jordi Uriel, Ana V. Ruescas-Nicolau and Sandra Alemany - “Capture and Automatic Production of Digital Humans in Real Motion with a Temporal 3D Scanner”

11:45 Marilena Lemonari, Nefeli Andreou, Nuria Pelechano, Panayiotis Charalambous and Yiorgos Chrysanthou - “LexiCrowd: A Learning Paradigm towards Text to Behaviour Parameters for Crowds”

12:00 Froso Sarri, Panagiotis Kasnesis, Spyridon Symeonidis, Ioannis Th. Paraskevopoulos, Sotiris Diplaris, Federico Posteraro, George Georgoudis, and Katerina Mania – “Embodied Augmented Reality for Lower Limb Rehabilitation”

12:15 Vinu Kamalasanan, Melanie Krüger, and Monika Sester – “Interacting with a virtual cyclist in Mixed reality affects pedestrian walking”


Session 3 – 15:30-17:15 “Capturing and simulating virtual humans – CLIPE results 2”

15:30 Marilena Lemonari – “Authoring Crowd by Narratives”

15:45 Bharat Vyas – “Physiology driven variation of human animation based on body type”

16:00 Kiran Chhatre – “Adaptive communicative social behaviours for virtual characters in small conversational groups”

16:15 Ariel Kwiatkowski – “Reinforcement learning to simulate virtual characters”

16:30 Radek Daněček (video) – “Emotion-driven face and body capture and animation”

16:45 Yuliang Xiu  (video) – “Reconstructing fully clothed characters from images“

17:00 Mirela Ostrek (video) – “Immersive characters for Mixed Reality Scenes”
