Show simple item record

dc.contributor.author	Jha, Debesh
dc.contributor.author	Ali, Sharib
dc.contributor.author	Tomar, Nikhil Kumar
dc.contributor.author	Johansen, Håvard D.
dc.contributor.author	Johansen, Dag
dc.contributor.author	Rittscher, Jens
dc.contributor.author	Riegler, Michael
dc.contributor.author	Halvorsen, Pål
dc.date.accessioned	2022-08-16T11:54:33Z
dc.date.available	2022-08-16T11:54:33Z
dc.date.created	2021-09-20T12:28:42Z
dc.date.issued	2021-03-04
dc.identifier.citation	IEEE Access. 2021, 9, 40496-40510.	en_US
dc.identifier.issn	2169-3536
dc.identifier.uri	https://hdl.handle.net/11250/3012104
dc.description.abstract	Computer-aided detection, localization, and segmentation methods can help improve colonoscopy procedures. Although many methods have been developed for automatic detection and segmentation of polyps, benchmarking of state-of-the-art methods remains an open problem, owing to the growing number of computer vision methods that can be applied to polyp datasets. Benchmarking novel methods can give direction to the development of automated polyp detection and segmentation; furthermore, it ensures that results produced in the community are reproducible and allow a fair comparison of the developed methods. In this paper, we benchmark several recent state-of-the-art methods on Kvasir-SEG, an open-access dataset of colonoscopy images for polyp detection, localization, and segmentation, evaluating both accuracy and speed. While most methods in the literature achieve competitive accuracy, we show that the proposed ColonSegNet achieved a better trade-off, with an average precision of 0.8000, a mean IoU of 0.8100, and the fastest speed of 180 frames per second for the detection and localization task. Likewise, ColonSegNet achieved a competitive dice coefficient of 0.8206 and the best average speed of 182.38 frames per second for the segmentation task. Our comprehensive comparison with various state-of-the-art methods shows the importance of benchmarking deep learning methods for automated real-time polyp identification and delineation, which could transform current clinical practice and minimise miss-detection rates.	en_US
dc.description.sponsorship	Debesh Jha is funded by the Research Council of Norway, project number 263248 (Privaton). The computations in this paper were performed on equipment provided by the Experimental Infrastructure for Exploration of Exascale Computing (eX3), which is financially supported by the Research Council of Norway under contract 270053. Part of the computational resources was also drawn from research supported by the National Institute for Health Research (NIHR) Oxford BRC, with additional support from the Wellcome Trust Core Award, grant number 203141/Z/16/Z. Sharib Ali is supported by the NIHR Oxford Biomedical Research Centre.	en_US
dc.language.iso	eng	en_US
dc.publisher	Institute of Electrical and Electronics Engineers	en_US
dc.relation.ispartofseries	IEEE Access;Volume: 9, 2021
dc.rights	Attribution 4.0 International
dc.rights.uri	http://creativecommons.org/licenses/by/4.0/deed.no
dc.subject	Medical image segmentation	en_US
dc.subject	ColonSegNet	en_US
dc.subject	Colonoscopy	en_US
dc.subject	Polyps	en_US
dc.subject	Detection	en_US
dc.subject	Localization	en_US
dc.subject	Kvasir-SEG	en_US
dc.title	Real-Time Polyp Detection, Localization and Segmentation in Colonoscopy Using Deep Learning	en_US
dc.type	Peer reviewed	en_US
dc.type	Journal article	en_US
dc.description.version	publishedVersion	en_US
cristin.ispublished	true
cristin.fulltext	original
cristin.qualitycode	1
dc.identifier.doi	https://doi.org/10.1109/ACCESS.2021.3063716
dc.identifier.cristin	1935966
dc.source.journal	IEEE Access	en_US
dc.source.volume	9	en_US
dc.source.issue	9	en_US
dc.source.pagenumber	40496-40510	en_US
dc.relation.project	National Institutes of Health: 203141	en_US
dc.relation.project	Norges forskningsråd: 270053	en_US
dc.relation.project	Norges forskningsråd: 263248	en_US


