Inferring emotional information from vocal and visual cues: A cross-cultural comparison

  • Maria Teresa Riviello
  • Anna Esposito
  • Mohamed Chetouani
  • David Cohen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Scopus citations

Abstract

The present work reports the results of perceptual experiments exploring the human ability to recognize emotional expressions through the visual and auditory channels. It investigates whether one channel is more effective than the other for inferring emotional information, and whether this effectiveness is affected by the cultural context, in particular by language. To this aim, American, French, and Italian subjects took part in a comparative analysis of subjective perceptions of emotional states dynamically portrayed by visual and vocal cues, exploiting a cross-modal database of verbal and non-verbal American video clips. The key difference among the groups is that one group of participants consists of native speakers of the language who belong to the same cultural context as the video clips used as stimuli (American English), while the other two do not. Results showed that the inference of emotional information is affected by the communication mode and that language plays a role.

Original language: English
Title of host publication: 2011 2nd International Conference on Cognitive Infocommunications, CogInfoCom 2011
State: Published - 2011
Externally published: Yes
Event: 2011 2nd International Conference on Cognitive Infocommunications, CogInfoCom 2011 - Budapest, Hungary
Duration: 7 Jul 2011 - 9 Jul 2011

Publication series

Name: 2011 2nd International Conference on Cognitive Infocommunications, CogInfoCom 2011

Conference

Conference: 2011 2nd International Conference on Cognitive Infocommunications, CogInfoCom 2011
Country/Territory: Hungary
City: Budapest
Period: 7/07/11 - 9/07/11
