PAD-based multimodal affective fusion

S. W. Gilroy, M. Cavazza, Markus Niiranen, E. Andre, T. Vogt, J. Urbain, M. Benayoun, H. Seichter, M. Billinghurst

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

26 Citations (Scopus)

Abstract

The study of multimodality is comparatively less developed for affective interfaces than for their traditional counterparts. However, one condition for the successful development of affective interface technologies is the availability of frameworks for real-time multimodal fusion. In this paper, we describe an approach to multimodal affective fusion that relies on a dimensional model, Pleasure-Arousal-Dominance (PAD), to support the fusion of affective modalities, with each input modality represented as a PAD vector. We describe how this model supports both affective content fusion and temporal fusion within a unified approach. We report results from early user studies which confirm the existence of a correlation between measured affective input and user temperament scores.
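
The abstract does not give the fusion arithmetic itself; the sketch below is only an illustration of the general idea it describes, namely that each input modality yields a PAD vector and that content and temporal fusion are handled in one framework. The confidence weighting and exponential recency decay used here are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): fuse per-modality
# Pleasure-Arousal-Dominance (PAD) vectors into a single PAD estimate.
from dataclasses import dataclass
import time


@dataclass
class PADSample:
    pleasure: float    # in [-1, 1]
    arousal: float     # in [-1, 1]
    dominance: float   # in [-1, 1]
    confidence: float  # assumed recogniser confidence in [0, 1]
    timestamp: float   # seconds


def fuse_pad(samples, now, half_life=2.0):
    """Confidence- and recency-weighted average of per-modality PAD vectors.

    The exponential decay stands in for the paper's temporal fusion; the
    half-life value is an arbitrary assumption.
    """
    totals = [0.0, 0.0, 0.0]
    weight_sum = 0.0
    for s in samples:
        decay = 0.5 ** ((now - s.timestamp) / half_life)  # older samples count less
        w = s.confidence * decay
        totals[0] += w * s.pleasure
        totals[1] += w * s.arousal
        totals[2] += w * s.dominance
        weight_sum += w
    if weight_sum == 0.0:
        return (0.0, 0.0, 0.0)  # neutral affect when no evidence is available
    return tuple(v / weight_sum for v in totals)


if __name__ == "__main__":
    now = time.time()
    speech = PADSample(0.4, 0.7, 0.1, confidence=0.8, timestamp=now - 0.5)
    gesture = PADSample(0.2, 0.5, 0.3, confidence=0.6, timestamp=now - 1.5)
    print(fuse_pad([speech, gesture], now))
```
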
Original language: English
Title of host publication: Proceedings of the 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009
Place of publication: Piscataway, NJ, USA
Publisher: IEEE Institute of Electrical and Electronic Engineers
Number of pages: 8
ISBN (Print): 978-1-4244-4800-5, 978-1-4244-4799-2
Publication status: Published - 2009
MoE publication type: A4 Article in a conference publication
Event: 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009 - Amsterdam, Netherlands
Duration: 10 Sept 2009 - 12 Sept 2009

Conference

Conference: 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009
Abbreviated title: ACII 2009
Country/Territory: Netherlands
City: Amsterdam
Period: 10/09/09 - 12/09/09
