Show simple item record

dc.contributor.author: Nunavath, Vimala
dc.contributor.author: Johansen, Sahand
dc.contributor.author: Johannessen, Tommy Sandtorv
dc.contributor.author: Jiao, Lei
dc.contributor.author: Hansen, Bjørge Hermann
dc.contributor.author: Stølevik, Sveinung Berntsen
dc.contributor.author: Goodwin, Morten
dc.date.accessioned: 2021-12-08T15:01:32Z
dc.date.available: 2021-12-08T15:01:32Z
dc.date.created: 2021-09-21T15:04:05Z
dc.date.issued: 2021-08-18
dc.identifier.citation: Sensors. 2021, 21 (16), 5564. [en_US]
dc.identifier.issn: 1424-8220
dc.identifier.uri: https://hdl.handle.net/11250/2833441
dc.description.abstract: Physical inactivity increases the risk of many adverse health conditions, including the world’s major non-communicable diseases, such as coronary heart disease, type 2 diabetes, and breast and colon cancers, shortening life expectancy. Medical caregivers and personal trainers have few methods for monitoring the types of physical activity a patient actually performs. To improve activity monitoring, we propose an artificial-intelligence-based approach to classify physical movement activity patterns. Specifically, we employ two deep learning (DL) methods for this purpose: a deep feed-forward neural network (DNN) and a deep recurrent neural network (RNN). We evaluate the two models on two physical movement datasets collected from several volunteers who carried tri-axial accelerometer sensors. The first dataset, from the UCI machine learning repository, contains 14 different activities of daily life (ADL) and was collected from 16 volunteers who carried a single wrist-worn tri-axial accelerometer. The second dataset includes ten other ADLs and was gathered from eight volunteers who placed the sensors on their hips. Our experimental results show that the RNN model performs accurately compared to state-of-the-art methods, classifying the fundamental movement patterns with an overall accuracy of 84.89% and an overall F1-score of 82.56%. The results indicate that our method offers medical doctors and trainers a promising way to track and understand a patient’s physical activities precisely for better treatment. [en_US]
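The pipeline the abstract describes — segmenting tri-axial accelerometer streams and classifying each segment — can be sketched as follows. This is an illustrative sliding-window segmentation with simple hand-crafted per-axis features, not the authors' DNN/RNN implementation; the window size, step, sampling rate, and synthetic signal below are all assumptions for the sake of the example.

```python
import numpy as np

def make_windows(signal, window_size=32, step=16):
    """Segment a (T, 3) tri-axial accelerometer stream into
    overlapping windows of shape (n_windows, window_size, 3)."""
    starts = range(0, len(signal) - window_size + 1, step)
    return np.stack([signal[s:s + window_size] for s in starts])

def extract_features(windows):
    """Per-axis mean and standard deviation for each window:
    a 6-dimensional feature vector per window."""
    return np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

# Synthetic example: 4 s of walking-like data at an assumed 32 Hz
rng = np.random.default_rng(0)
t = np.arange(128) / 32.0
signal = np.stack([
    np.sin(2 * np.pi * 2 * t),          # x: periodic gait component
    0.5 * np.cos(2 * np.pi * 2 * t),    # y: smaller periodic component
    rng.normal(0.0, 0.1, 128),          # z: sensor noise
], axis=1)

X = extract_features(make_windows(signal))
print(X.shape)  # one feature vector per window
```

In the paper's actual models, these windows would be fed directly to a DNN or RNN rather than summarized into hand-crafted statistics; the sketch only shows the windowing step that both approaches share.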
dc.language.iso: eng [en_US]
dc.publisher: MDPI [en_US]
dc.relation.ispartofseries: Sensors; Volume 21 / Issue 16
dc.rights: Navngivelse 4.0 Internasjonal (Attribution 4.0 International)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/deed.no
dc.subject: Deep learning [en_US]
dc.subject: Helse [en_US]
dc.subject: Health [en_US]
dc.subject: Physical activity [en_US]
dc.subject: Fysisk aktiviteter [en_US]
dc.subject: Physical activities [en_US]
dc.subject: Classifications [en_US]
dc.subject: Machine learning [en_US]
dc.subject: Accelerometer data [en_US]
dc.subject: Sensors [en_US]
dc.subject: Feed-forward neural networks [en_US]
dc.subject: Recurrent neural networks [en_US]
dc.title: Deep Learning for Classifying Physical Activities from Accelerometer Data [en_US]
dc.type: Peer reviewed [en_US]
dc.type: Journal article [en_US]
dc.description.version: publishedVersion [en_US]
dc.rights.holder: © 2021 by the authors. [en_US]
dc.source.articlenumber: 5564 [en_US]
cristin.ispublished: true
cristin.fulltext: original
cristin.qualitycode: 1
dc.identifier.doi: https://doi.org/10.3390/s21165564
dc.identifier.cristin: 1936674
dc.source.journal: Sensors [en_US]
dc.source.volume: 21 [en_US]
dc.source.issue: 16 [en_US]
dc.source.pagenumber: 28 [en_US]
dc.subject.nsi: VDP::Informasjons- og kommunikasjonsteknologi: 550 [en_US]
dc.subject.nsi: VDP::Information and communication technology: 550 [en_US]


