Show simple item record

dc.contributor.author: Zhong, Xiaolong
dc.contributor.author: Yin, Zhong
dc.contributor.author: Zhang, Jianhua
dc.date.accessioned: 2021-09-01T07:32:55Z
dc.date.available: 2021-09-01T07:32:55Z
dc.date.created: 2020-10-07T11:59:12Z
dc.date.issued: 2020-09-09
dc.identifier.citation: Chinese Control Conference (CCC). 2020, 2020- 7516-7521. [en_US]
dc.identifier.isbn: 978-9-8815-6390-3
dc.identifier.isbn: 978-1-7281-6523-3
dc.identifier.issn: 1934-1768
dc.identifier.uri: https://hdl.handle.net/11250/2772110
dc.description.abstract: Using electroencephalogram (EEG) signals for emotion detection has attracted widespread research attention. However, cross-subject emotion recognition has long remained a major obstacle because features generalize poorly across subjects. To address this difficulty, this study introduces the moving average (MA) technique to smooth out short-term fluctuations and highlight longer-term trends or cycles. Building on the MA technique, an effective method for cross-subject emotion recognition was developed, incorporating an attention-based salient region extraction mechanism that enhances the representations generated by a network by modelling the interdependencies between the channels of its informative features. The effectiveness of the method was validated on the Dataset for Emotion Analysis using Physiological signals (DEAP) and the MAHNOB-HCI multimodal tagging database. Compared with recent similar works, the method was found to be effective for emotion recognition across all subjects, achieving accuracies of 66.23% for valence and 68.50% for arousal (DEAP), and 70.25% for valence and 73.27% for arousal (MAHNOB-HCI), on the Gamma frequency band. Benefiting from its strong representation-learning capacity in the two-dimensional space, the method is particularly efficient for emotion recognition on the Beta and Gamma bands. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Institute of Electrical and Electronics Engineers [en_US]
dc.relation.ispartof: 2020 39th Chinese Control Conference (CCC)
dc.relation.ispartofseries: Chinese Control Conference (CCC);2020 39th Chinese Control Conference (CCC)
dc.subject: Emotional recognition [en_US]
dc.subject: Physiological signals [en_US]
dc.subject: Deep learning [en_US]
dc.subject: Classification [en_US]
dc.subject: Machine learning [en_US]
dc.subject: Cross-subjects [en_US]
dc.title: Cross-Subject emotion recognition from EEG using Convolutional Neural Networks [en_US]
dc.type: Peer reviewed [en_US]
dc.type: Journal article [en_US]
dc.description.version: acceptedVersion [en_US]
dc.rights.holder: © 2020 IEEE [en_US]
cristin.ispublished: true
cristin.fulltext: original
cristin.fulltext: postprint
cristin.qualitycode: 1
dc.identifier.doi: https://doi.org/10.23919/CCC50068.2020.9189559
dc.identifier.cristin: 1837877
dc.source.journal: Chinese Control Conference (CCC) [en_US]
dc.source.volume: 2020- [en_US]
dc.source.issue: 2020 39th Chinese Control Conference (CCC) [en_US]
dc.source.pagenumber: 7516-7521 [en_US]
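The abstract describes smoothing EEG features with a moving average to suppress short-term fluctuations and expose longer-term trends. As a rough illustration only (this is not the authors' implementation; the window length and the noisy-sine stand-in for an EEG channel are hypothetical), a simple moving average can be sketched as:

```python
import numpy as np

def moving_average(signal, window=5):
    # Uniform-kernel moving average: each output sample is the mean
    # of `window` consecutive input samples ("valid" mode shortens
    # the output by window - 1 samples).
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="valid")

# Hypothetical example: a noisy 5 Hz sine standing in for one EEG channel.
t = np.linspace(0, 1, 128)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(128)
smoothed = moving_average(noisy, window=8)
```

Averaging over a fixed window is the standard trade-off here: a longer window suppresses more high-frequency noise but also blurs genuinely fast signal changes.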


Associated file(s)


This item appears in the following Collection(s)
