Comparative analysis of physiological signals and electroencephalogram (EEG) for multimodal emotion recognition using generative models

Cristian A. Torres-Valencia, Hernan F. Garcia-Arias, Mauricio A. Alvarez Lopez, Alvaro A. Orozco-Gutierrez

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

43 Scopus citations

Abstract

Multimodal emotion recognition (MER) is an application of machine learning in which different biological signals are used to automatically classify a given affective state. MER systems have been developed for applications ranging from psychological evaluation and anxiety assessment to human-machine interfaces and marketing. Several classification spaces have been proposed in the state of the art for the emotion recognition task; the best known are the discrete and dimensional spaces, in which emotions are described in terms of a set of basic emotions and of latent dimensions, respectively. Dimensional classification spaces allow a wider range of emotional states to be analyzed. The most common dimensional space used for this purpose is the arousal/valence space, where emotions are described in terms of their intensity, going from inactive to active in the arousal dimension and from unpleasant to pleasant in the valence dimension. Physiological signals and the EEG are well suited for emotion recognition because an emotional state generates responses from different biological systems of the human body. Since the expression of an emotion is a dynamic process, we propose the use of generative models such as Hidden Markov Models (HMMs) to capture the dynamics of the signals for subsequent classification of emotional states in terms of arousal and valence. This work uses an international database for emotion classification known as the Dataset for Emotion Analysis using Physiological signals (DEAP). The objective is to determine which of the physiological and EEG signals carry the most relevant information for the emotion recognition task. Several experiments using HMMs built from individual signals and from combinations of them are performed, and the results show that some of these signals, namely the EEG, the galvanic skin response (GSR), and the heart rate (HR), provide better discrimination between arousal and valence levels.
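The classification scheme described in the abstract, training one generative model per affective level and assigning a test recording to the model that explains it best, can be sketched as follows. This is a minimal illustration using hmmlearn's GaussianHMM, not the authors' implementation; the feature extraction, the number of hidden states, and the synthetic data standing in for DEAP trials are all assumptions made for the example.

```python
# Sketch: per-class Gaussian HMMs for binary arousal classification.
# Assumes per-trial feature sequences (e.g. windowed GSR/HR/EEG features)
# have already been extracted; synthetic data is used in place of DEAP.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

def synthetic_sequences(n_trials, seq_len, n_features, shift):
    # Placeholder for per-trial feature sequences (not the real DEAP data).
    return [rng.normal(loc=shift, size=(seq_len, n_features)) for _ in range(n_trials)]

train = {"low": synthetic_sequences(20, 60, 4, 0.0),
         "high": synthetic_sequences(20, 60, 4, 0.8)}

# One generative model per arousal level; each HMM learns the temporal
# dynamics of its class from the concatenated training sequences.
models = {}
for label, seqs in train.items():
    X = np.vstack(seqs)
    lengths = [len(s) for s in seqs]
    hmm = GaussianHMM(n_components=3, covariance_type="diag",
                      n_iter=50, random_state=0)
    hmm.fit(X, lengths)
    models[label] = hmm

def classify(seq):
    # Assign the label whose HMM gives the highest log-likelihood.
    return max(models, key=lambda lab: models[lab].score(seq))

test_seq = synthetic_sequences(1, 60, 4, 0.8)[0]
print("Predicted arousal level:", classify(test_seq))
```

The same maximum-likelihood scheme extends to valence by training a second pair of HMMs on valence labels, and signals can be compared by repeating the experiment with different feature subsets.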

Original language: English
Title of host publication: 2014 19th Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781479976669
DOIs
State: Published - 14 Jan 2015
Externally published: Yes
Event: 2014 19th Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2014 - Armenia-Quindio, Colombia
Duration: 17 Sep 2014 - 19 Sep 2014

Publication series

Name: 2014 19th Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2014

Conference

Conference: 2014 19th Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2014
Country/Territory: Colombia
City: Armenia-Quindio
Period: 17/09/14 - 19/09/14
