dc.contributor.author: Nikolov, Nikolay
dc.contributor.author: Dessalk, Yared Dejene
dc.contributor.author: Khan, Akif Quddus
dc.contributor.author: Soylu, Ahmet
dc.contributor.author: Matskin, Mihhail
dc.contributor.author: Payberah, Amir
dc.contributor.author: Roman, Dumitru
dc.date.accessioned: 2022-03-07T12:45:30Z
dc.date.available: 2022-03-07T12:45:30Z
dc.date.created: 2021-11-04T11:18:06Z
dc.date.issued: 2021-11-26
dc.identifier.citation: Internet of Things. 2021, 16, 1-19.
dc.identifier.issn: 2542-6605
dc.identifier.uri: https://hdl.handle.net/11250/2983427
dc.description.abstract: Big Data processing, particularly amid the proliferation of Internet of Things (IoT) technologies and the convergence of IoT, edge, and cloud computing, involves handling massive, complex data sets on heterogeneous resources and incorporating diverse tools, frameworks, and processes to help organizations make sense of data collected from various sources. This set of operations, referred to as Big Data workflows, requires taking advantage of the elasticity of cloud infrastructures for scalability. In this article, we present the design and prototype implementation of a Big Data workflow approach that uses software container technologies, message-oriented middleware (MOM), and a domain-specific language (DSL) to enable highly scalable workflow execution and abstract workflow definition. We demonstrate our system in a use case and a set of experiments that show the practical applicability of the proposed approach for the specification and scalable execution of Big Data workflows. Furthermore, we compare the scalability of our approach with that of Argo Workflows, one of the most prominent tools in the area of Big Data workflows, and provide a qualitative evaluation of the proposed DSL and overall approach with respect to the existing literature.
dc.description.sponsorship: This work was partly funded by the EC H2020 projects "DataCloud: Enabling The Big Data Pipeline Lifecycle on the Computing Continuum" (Grant no. 101016835) and "COGNITWIN: Cognitive plants through proactive self-learning hybrid digital twins" (Grant no. 870130), and the NFR project "BigDataMine" (Grant no. 309691).
dc.language.iso: eng
dc.publisher: Elsevier
dc.relation.ispartofseries: Internet of Things; Volume 16, December 2021, 100440
dc.rights: Attribution 4.0 International (Navngivelse 4.0 Internasjonal)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/deed.no
dc.subject: Internet of Things
dc.subject: Big Data
dc.subject: Big data workflows
dc.subject: Domain-specific languages
dc.subject: Software containers
dc.title: Conceptualization and scalable execution of big data workflows using domain-specific languages and software containers
dc.type: Peer reviewed
dc.type: Journal article
dc.description.version: publishedVersion
dc.rights.holder: © 2021 The Authors
dc.source.articlenumber: 100440
cristin.ispublished: true
cristin.fulltext: postprint
cristin.fulltext: original
cristin.qualitycode: 1
dc.identifier.doi: https://doi.org/10.1016/j.iot.2021.100440
dc.identifier.cristin: 1951355
dc.source.journal: Internet of Things
dc.source.volume: 16
dc.source.pagenumber: 1-19
dc.relation.project: Norges forskningsråd: 309691
dc.relation.project: EC/H2020/870130
dc.relation.project: EC/H2020/101016835
dc.subject.nsi: VDP::Datateknologi: 551
dc.subject.nsi: VDP::Computer technology: 551



Attribution 4.0 International (Navngivelse 4.0 Internasjonal)
Except where otherwise noted, this item's license is described as Attribution 4.0 International (Navngivelse 4.0 Internasjonal).