Requirements and software framework for adaptive multimodal affect recognition

Elena Vildjiounaite, Vesa Kyllönen, Olli Vuorinen, Satu-Marja Mäkelä, Tommi Keränen, Markus Niiranen, Jouni Knuutinen, Johannes Peltola

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

1 Citation (Scopus)

Abstract

This work presents a software framework for real-time multimodal affect recognition. The framework supports categorical emotional models and simultaneous classification of emotional states along different dimensions. It also makes it possible to incorporate the diverse approaches to multimodal fusion proposed in the current state of the art, to adapt to the context dependency of emotional expression, and to meet different application requirements. The results of using the framework for audio-video emotion recognition of audiences at different shows (useful information, because the emotions of co-located people affect each other) confirm that the framework provides the desired functionality conveniently and demonstrate that using contextual information increases recognition accuracy. (21 refs.)
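The decision-level fusion the abstract alludes to can be illustrated with a minimal sketch: each modality (audio, video) produces class posteriors over categorical emotions, and a context-dependent weight combines them before picking the most likely class. All names, weights, and the context rule below are hypothetical illustrations, not taken from the paper.

```python
# Illustrative sketch of context-adaptive decision-level fusion.
# Each modality outputs posteriors over categorical emotion classes;
# the fusion weight is adjusted by context (e.g. a noisy show).

EMOTIONS = ["positive", "neutral", "negative"]

def fuse(audio_probs, video_probs, audio_weight=0.5):
    """Weighted-sum fusion of per-modality class posteriors."""
    w = audio_weight
    fused = [w * a + (1.0 - w) * v for a, v in zip(audio_probs, video_probs)]
    total = sum(fused)  # renormalise so the result is a distribution
    return [p / total for p in fused]

def classify(audio_probs, video_probs, context=None):
    # Hypothetical context rule: down-weight audio when the show is noisy.
    weight = 0.2 if context == "noisy" else 0.5
    fused = fuse(audio_probs, video_probs, weight)
    return EMOTIONS[fused.index(max(fused))]

# Example: video strongly suggests a positive reaction, audio is ambiguous.
print(classify([0.4, 0.4, 0.2], [0.7, 0.2, 0.1], context="noisy"))  # → positive
```

A real system in this vein would learn the fusion weights per context rather than hard-coding them, which is the kind of adaptation the framework is designed to support.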
Original language: English
Title of host publication: 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops
Place of publication: Piscataway, NJ, USA
Publisher: IEEE Institute of Electrical and Electronic Engineers
ISBN (Electronic): 978-1-4244-4799-2
ISBN (Print): 978-1-4244-4800-5
DOI: 10.1109/ACII.2009.5349393
Publication status: Published - 2009
MoE publication type: A4 Article in a conference publication
Event: 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009 - Amsterdam, Netherlands
Duration: 10 Sep 2009 – 12 Sep 2009

Conference

Conference: 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009
Abbreviated title: ACII 2009
Country: Netherlands
City: Amsterdam
Period: 10/09/09 – 12/09/09

Keywords

  • emotion recognition
  • labeling
  • mood

Cite this

Vildjiounaite, E., Kyllönen, V., Vuorinen, O., Mäkelä, S-M., Keränen, T., Niiranen, M., Knuutinen, J., & Peltola, J. (2009). Requirements and software framework for adaptive multimodal affect recognition. In 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops. Piscataway, NJ, USA: IEEE Institute of Electrical and Electronic Engineers. https://doi.org/10.1109/ACII.2009.5349393