A comparison of identification models for human emotions as reactions to a dynamical virtual face
Date: 2014
This report presents research on the dependencies between human emotional signals (excitement and frustration) and a dynamic virtual stimulus (a virtual 3D face with changing distance between the eyes). The dependencies are described by linear "input-output" models. The emotional signals are recorded with the 14-channel Emotiv Epoc EEG device, pre-processed automatically by the device and its specialized software, and the relative values of the emotional signals are used for modelling. Two model identification methods are proposed: a modified correlation-based method and a least-squares method with projection onto a stability domain, which ensures that the identified models are stable. Identification was performed for nine volunteers using two types of stimuli. Validation showed that the models predict the emotional signals with relatively high accuracy: the absolute relative error of excitement prediction does not exceed 9%, and that of frustration prediction does not exceed 3%.
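To make the modelling idea concrete, the sketch below illustrates one plausible reading of the least-squares approach with a stability projection: a linear "input-output" (ARX-type) model relating the stimulus signal to an emotional signal is estimated by ordinary least squares, and any unstable poles are then shrunk back inside the unit circle. The function names, model orders, and the radial-shrinking projection are assumptions for illustration only; the paper's exact model structure and projection scheme may differ.

```python
import numpy as np

def identify_arx_ls(u, y, na=2, nb=2):
    """Estimate a linear ARX 'input-output' model
        y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb]
    by ordinary least squares. Here u is the stimulus signal (e.g. distance
    between the eyes of the virtual face) and y the emotional signal
    (excitement or frustration). Orders na, nb are illustrative choices."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        row = [-y[k - i] for i in range(1, na + 1)] + \
              [u[k - j] for j in range(1, nb + 1)]
        rows.append(row)
        targets.append(y[k])
    Phi, Y = np.asarray(rows), np.asarray(targets)
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return theta[:na], theta[na:]   # denominator (a) and numerator (b) coefficients

def project_to_stability(a, rho=0.99):
    """Project the estimated model onto a stability domain by radially
    shrinking poles that lie outside the unit circle (one simple scheme,
    assumed here; the paper's projection may be defined differently)."""
    poles = np.roots(np.concatenate(([1.0], a)))
    scale = max(1.0, np.max(np.abs(poles)) / rho)
    # scaling every pole by 1/scale corresponds to a_i -> a_i / scale**i
    return a / scale ** np.arange(1, len(a) + 1)

# Usage with synthetic signals (placeholders for the recorded data):
rng = np.random.default_rng(0)
u = rng.standard_normal(500)                 # stimulus signal
y = np.convolve(u, [0.0, 0.5, 0.3])[:500]    # toy "emotional" response
a, b = identify_arx_ls(u, y)
a_stable = project_to_stability(a)
```

The projection step only changes the model when the least-squares estimate happens to be unstable; otherwise the coefficients are returned unchanged, so prediction accuracy on the validation data is not sacrificed for stable estimates.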