Can a robot speak like a human?

Imagine having a face-to-face conversation with an intelligent virtual agent. At ISTC, the Speech and Multimodal Communication Laboratory (SMCL) has completed the first step towards this goal, starting precisely from the "face": the result is LUCIA, an emotive talking head.

Human-computer interaction is widely investigated across many research fields. Possible applications range from dialogue systems for information access and e-commerce services to e-learning tutors for teaching language skills, as well as avatar animation in virtual environments and computer games.

But is it possible to build completely autonomous agents? An intelligent robot should not merely mimic human actions: it should also behave like a real person. Eye movements, facial expressions, appropriate gestures: it is very hard to capture all these nuances in an artificial system. At ISTC, the Speech and Multimodal Communication Laboratory (SMCL) took up this challenge by developing a three-dimensional animated computer talking head. Its name is LUCIA, and it produces emotive and expressive natural speech with a wide variety of facial expressions and lip movements. LUCIA emulates the human facial muscles through specific, selectively activated functions. These facial animation parameters are fundamental for achieving natural movement, which is further regulated by intensity and duration constraints.
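The idea of a selectively activated parameter bounded by intensity and duration constraints can be sketched in code. This is a hypothetical illustration, not LUCIA's actual implementation: the parameter name, ranges, and clamping strategy are all invented for the example.

```python
# Hypothetical sketch of a facial animation parameter whose activation
# is regulated by intensity and duration constraints (invented values;
# not the actual LUCIA system).
from dataclasses import dataclass

@dataclass
class FacialParameter:
    name: str
    min_intensity: float   # lower bound of the parameter's range
    max_intensity: float   # upper bound of the parameter's range
    min_duration_ms: int   # shortest natural activation time
    max_duration_ms: int   # longest natural activation time

    def activate(self, intensity: float, duration_ms: int) -> tuple[float, int]:
        """Clamp a requested activation to the parameter's natural limits."""
        i = max(self.min_intensity, min(self.max_intensity, intensity))
        d = max(self.min_duration_ms, min(self.max_duration_ms, duration_ms))
        return i, d

# Example: an illustrative "eyebrow raise" parameter.
brow = FacialParameter("eyebrow_raise", 0.0, 1.0, 80, 600)
print(brow.activate(1.4, 2000))  # out-of-range request, clamped to (1.0, 600)
```

Clamping each activation keeps any requested expression within the range of movements a real face could produce, which is one simple way such constraints could enforce naturalness.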

SMCL's aim was to arrive at a model capable of simulating emotional behaviour. Voice was an essential factor, since emotions are conveyed primarily through speech communication. To program LUCIA's voice and animation, the SMCL team designed integrated software that emulates a real human by reproducing the recorded movements of markers positioned on a speaker's face.
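The marker-based approach described above can be illustrated with a minimal sketch: recorded marker positions are compared against a neutral pose to obtain the displacements that drive the animated face. The marker names, coordinates, and 2-D representation here are assumptions for illustration only.

```python
# Hypothetical sketch: deriving animation targets from recorded facial
# marker positions, relative to a neutral pose (invented data; not the
# actual SMCL software).

# Neutral (resting) 2-D positions of two illustrative mouth markers.
NEUTRAL = {"mouth_left": (0.0, 0.0), "mouth_right": (10.0, 0.0)}

def displacements(frame: dict[str, tuple[float, float]]) -> dict[str, tuple[float, float]]:
    """Displacement of each marker in a recorded frame from the neutral pose."""
    return {
        name: (x - NEUTRAL[name][0], y - NEUTRAL[name][1])
        for name, (x, y) in frame.items()
    }

# One recorded frame of a smile: mouth corners move outward and upward.
smile_frame = {"mouth_left": (-1.5, 2.0), "mouth_right": (11.5, 2.0)}
print(displacements(smile_frame))
# {'mouth_left': (-1.5, 2.0), 'mouth_right': (1.5, 2.0)}
```

In a full system, such per-frame displacements would then drive the corresponding muscle functions of the 3-D head, frame by frame.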

Contact: Piero Cosi

ISTC Group: Speech and Multimodal Communication Laboratory