Show simple item record

dc.contributor.author	Mohan, Karnati
dc.contributor.author	Seal, Ayan
dc.contributor.author	Krejcar, Ondrej
dc.contributor.author	Yazidi, Anis
dc.date.accessioned	2021-02-01T22:16:18Z
dc.date.accessioned	2021-03-11T11:22:49Z
dc.date.available	2021-02-01T22:16:18Z
dc.date.available	2021-03-11T11:22:49Z
dc.date.issued	2020-10-16
dc.identifier.citation	Mohan, Seal, Krejcar, Yazidi. Facial Expression Recognition Using Local Gravitational Force Descriptor-Based Deep Convolution Neural Networks. IEEE Transactions on Instrumentation and Measurement. 2020
dc.identifier.issn	0018-9456
dc.identifier.issn	1557-9662
dc.identifier.uri	https://hdl.handle.net/10642/10003
dc.description.abstract	An image is worth a thousand words; hence, a face image illustrates extensive details about the specification, gender, age, and emotional states of mind. Facial expressions play an important role in community-based interactions and are often used in the behavioral analysis of emotions. Automatic recognition of facial expressions from a facial image is a challenging task in the computer vision community and admits a large set of applications, such as driver safety, human–computer interactions, health care, behavioral science, video conferencing, cognitive science, and others. In this work, a deep-learning-based scheme is proposed for identifying the facial expression of a person. The proposed method consists of two parts. The former finds local features from face images using a local gravitational force descriptor, while, in the latter, the descriptor is fed into a novel deep convolution neural network (DCNN) model. The proposed DCNN has two branches. The first branch explores geometric features, such as edges, curves, and lines, whereas holistic features are extracted by the second branch. Finally, the score-level fusion technique is adopted to compute the final classification score. The proposed method, along with 25 state-of-the-art methods, is implemented on five publicly available benchmark databases, namely, Facial Expression Recognition 2013, Japanese Female Facial Expressions, Extended Cohn-Kanade, Karolinska Directed Emotional Faces, and Real-world Affective Faces. The databases consist of seven basic emotions: neutral, happiness, anger, sadness, fear, disgust, and surprise. The proposed method is compared with existing approaches using four evaluation metrics, namely, accuracy, precision, recall, and F1-score. The obtained results demonstrate that the proposed method outperforms all state-of-the-art methods on all the databases.
dc.description.sponsorship	This work was supported in part by the project "Prediction of Diseases Through Computer Assisted Diagnosis System Using Images Captured by Minimally Invasive and Noninvasive Modalities," Computer Science and Engineering, PDPM Indian Institute of Information Technology, Design and Manufacturing Jabalpur, Jabalpur, India, under Grant ID: SPARC-MHRD-231, in part by the project IT4Neuro(degeneration) under Grant CZ.02.1.01/0.0/0.0/18 069/0010054, in part by the project "Smart Solutions in Ubiquitous Computing Environments," Grant Agency of Excellence, University of Hradec Kralove, Faculty of Informatics and Management, Czech Republic, under Grant ID: UHK-FIM-GE-2020, in part by the project at Universiti Teknologi Malaysia (UTM) under Research University Grant Vot-20H04, in part by the Malaysia Research University Network (MRUN) under Grant Vot 4L876, and in part by the Fundamental Research Grant Scheme (FRGS) under the Ministry of Education Malaysia under Grant Vot 5F073.
dc.language.iso	en
dc.publisher	Institute of Electrical and Electronics Engineers
dc.relation.ispartofseries	IEEE Transactions on Instrumentation and Measurement;Volume 70
dc.subject	Deep convolution neural networks
dc.subject	Facial expression recognition
dc.subject	Local gravitational forces
dc.subject	Descriptors
dc.subject	Score-level fusions
dc.subject	Softmax classifications
dc.title	Facial Expression Recognition Using Local Gravitational Force Descriptor-Based Deep Convolution Neural Networks
dc.type	Journal article
dc.type	Peer reviewed
dc.date.updated	2021-02-01T22:16:17Z
dc.description.version	publishedVersion
dc.identifier.doi	https://doi.org/10.1109/TIM.2020.3031835
dc.identifier.cristin	1885547
dc.source.journal	IEEE Transactions on Instrumentation and Measurement
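
The abstract above describes a two-branch DCNN whose per-branch outputs are combined by score-level fusion over seven emotion classes. The following minimal Python/NumPy sketch illustrates only that fusion step; the function names, the softmax branch outputs, and the equal-weight fusion rule are assumptions for illustration, not the authors' published implementation.

    # Hypothetical sketch of score-level fusion; not the authors' code.
    import numpy as np

    EMOTIONS = ["neutral", "happiness", "anger", "sadness",
                "fear", "disgust", "surprise"]

    def softmax(logits: np.ndarray) -> np.ndarray:
        """Numerically stable softmax over the last axis."""
        shifted = logits - logits.max(axis=-1, keepdims=True)
        exp = np.exp(shifted)
        return exp / exp.sum(axis=-1, keepdims=True)

    def fuse_scores(geo_logits: np.ndarray,
                    holistic_logits: np.ndarray,
                    w: float = 0.5) -> np.ndarray:
        """Combine the two branch outputs into one classification score.

        geo_logits      -- raw scores from the geometric-feature branch
        holistic_logits -- raw scores from the holistic-feature branch
        w               -- fusion weight for the geometric branch
                           (0.5 is an assumption, not from the paper)
        """
        return w * softmax(geo_logits) + (1.0 - w) * softmax(holistic_logits)

    # Toy usage with two fabricated 7-class score vectors.
    geo = np.array([0.2, 2.1, 0.3, 0.1, 0.4, 0.2, 0.5])
    hol = np.array([0.1, 1.8, 0.2, 0.3, 0.6, 0.1, 0.4])
    fused = fuse_scores(geo, hol)
    print(EMOTIONS[int(fused.argmax())])  # -> "happiness"

A weighted sum of per-branch softmax probabilities is one common form of score-level fusion; the paper itself should be consulted for the exact rule and weights used.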


Files in this item


This item appears in the following collection(s)
