Integration of gestures and speech in human-robot interaction
2012 IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom), 2012
We present an approach to enhancing the interaction abilities of the Nao humanoid robot by extending its communicative behavior with non-verbal gestures (hand and head movements, and gaze following). A set of non-verbal gestures was identified that Nao could use to enhance its presentation and turn-management capabilities in conversational interactions. We discuss our approach to modeling and synthesizing gestures on the Nao robot, and present an evaluation scheme that compares users' expectations with their actual experiences. We found that open arm gestures, head movements, and gaze following could significantly enhance Nao's ability to be expressive and appear lively, and to engage human users in conversational interactions.