Abstract
The past decade has witnessed growing interest in the role of human emotional behaviour in interactive multimodal computing. Although much consideration has been given to non-interactive affective stimuli (e.g., images and videos), the recognition of emotions within interactive virtual environments has received comparatively little attention. In the present study, a psychophysiological database was constructed, cataloguing the EEG, GSR, and heart-rate signals of 30 participants exposed to an affective virtual environment. A total of 743 features were extracted from the physiological signals; a feature selection technique then reduced the dimensionality of the feature space to a subset of only 30 features. Four classification techniques (KNN, SVM, Discriminant Analysis (DA), and Classification Tree) were employed to classify the affective psychophysiological database into four Affective Clusters (derived from a Valence-Arousal space) and eight Emotion Labels. Using cross-validation, the performances of more than a quarter of a million different classification settings (various window lengths, classifier settings, etc.) were investigated. The results suggest that the physiological signals can be employed to classify emotional experiences with high precision. KNN and SVM outperformed both the Classification Tree and DA classifiers, achieving mean accuracies of 97.01 and 92.84 percent, respectively.
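The pipeline the abstract describes (select 30 of 743 features, then classify with cross-validation) can be sketched with scikit-learn. This is a hypothetical illustration only: the paper's dataset, feature definitions, and selection technique are not reproduced here, so synthetic data and a generic univariate selector stand in for them.

```python
# Sketch of the described pipeline: 743 features -> 30 selected features
# -> classifier, scored by cross-validation. Synthetic data stands in for
# the (non-public) psychophysiological database; the selector and classifier
# hyperparameters are assumptions, not the paper's actual settings.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 743))   # placeholder for the 743 extracted features
y = rng.integers(0, 4, size=120)  # four Affective Clusters (Valence-Arousal quadrants)

for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    pipe = make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=30),  # reduce 743 features to 30
        clf,
    )
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

Wrapping the selector inside the pipeline ensures feature selection is refit on each training fold, avoiding the selection-bias leakage that would inflate cross-validated accuracy.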
| Original language | English |
| --- | --- |
| Article number | 8078217 |
| Pages (from-to) | 45-62 |
| Number of pages | 18 |
| Journal | IEEE Transactions on Affective Computing |
| Volume | 11 |
| Issue number | 1 |
| DOIs | |
| State | Published - 1 Jan 2020 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2010-2012 IEEE.
Keywords
- Virtual reality
- affective VR
- affective computing
- emotion-based affective physiological database
- interactive environments
ASJC Scopus subject areas
- Software
- Human-Computer Interaction