Children Interpretation of Emotional Body Language Displayed by a Robot

Previous results show that adults are able to interpret different key poses displayed by the robot, and also that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases arousal (the level of energy), valence (positive or negative) and stance (approaching or avoiding), whereas moving the head up produces an increase along these dimensions [1]. Hence, changing the head position should send intuitive signals that could be exploited during an interaction. The ALIZ-E target group is children between the ages of 8 and 11. Existing results suggest that children of this age are able to interpret human emotional body language [2, 3]. Building on these results, an experiment was conducted to test whether the findings of [1] also hold for children. If so, body postures and head position could be used to convey emotions during an interaction.

Publication type: 
Contribution to a volume (book chapter)
Author or Creator: 
Beck, Aryel
Cañamero, Lola
Damiano, Luisa
Sommavilla, Giacomo
Tesser, Fabio
Cosi, Piero
Publisher: 
Springer, Berlin/Heidelberg, Germany
Source: 
Proceedings of ICSR 2011 - Third International Conference on Social Robotics, edited by B. Mutlu, C. Bartneck, J. Ham, V. Evers, T. Kanda, pp. 62–70. Berlin/Heidelberg: Springer, 2011
Date: 
2011
Resource Identifier: 
http://www.cnr.it/prodotto/i/203942
https://dx.doi.org/10.1007/978-3-642-25504-5_7
http://www.springerlink.com/content/l26718l752k226ph
urn:isbn:978-3-642-25503-8
Language: 
English
ISTC Author: 
Fabio Tesser
Piero Cosi