Haoyang Du
Title: Talk to me: Creating plausible speech-driven conversational characters and gestures
Supervision Team: Cathy Ennis, TU Dublin / Rachel McDonnell, TCD / Benjamin Cowan, UCD and Julie Berndsen, UCD
Description: Interaction with virtual characters can increase engagement and create opportunities for immersion in a wide range of social contexts. With the growth of spaces like the Metaverse and applications such as ChatGPT, demand for engaging virtual characters that can generate plausible gestures and behaviours to accompany speech will only increase. In any space that allows for embodied interaction, where users are represented by a virtual avatar or interact with a virtual character, exchanges can become more engaging. However, the requirements of real-time dynamic interaction pose a serious challenge for developers: these characters need plausible, engaging behaviour and animation in scenarios where it is impossible to script exactly which actions will be required. We aim to tackle part of this problem by investigating speech-driven non-verbal social behaviours for virtual avatars (such as conversational body motion and gestures) and developing ways to generate plausible interactions with them in real-time interactive scenarios.