Huldra: a framework for collecting crowdsourced feedback on multimedia assets
Hammou, Malek; Midoglu, Cise; Hicks, Steven; Storås, Andrea; Sabet, Saeed; Strumke, Inga; Riegler, Michael; Halvorsen, Pål
Conference object
Published version
Permanent link: https://hdl.handle.net/11250/3051936
Publication date: 2022
Original version: https://doi.org/10.1145/3524273.3532887

Abstract
Collecting crowdsourced feedback to evaluate, rank, or score multimedia content can be cumbersome and time-consuming. Most existing survey tools are complicated, hard to customize, or tailored to a specific asset type. In this paper, we present an open source framework called Huldra, designed explicitly to address the challenges associated with user studies involving crowdsourced feedback collection. The web-based framework is built in a modular and configurable fashion to allow for easy adjustment of the user interface (UI) and the multimedia content, while providing integrations with reliable and stable backend solutions to facilitate the collection and analysis of responses. Our proposed framework can serve as an online survey tool for researchers working on topics such as Machine Learning (ML), audio, image, and video quality assessment, and Quality of Experience (QoE), who require user studies for the benchmarking of various types of multimedia content.
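To give a concrete sense of what a modular, configurable survey setup of this kind could look like, the sketch below defines a hypothetical survey configuration (asset list, presentation options, and a backend endpoint for collected responses) and a small submission helper. All names, URLs, and the configuration schema here are illustrative assumptions and are not taken from the Huldra codebase or its actual API.

// Hypothetical sketch of a configurable crowdsourced-feedback survey.
// SurveyConfig, submitResponse, and the example URLs are assumptions
// for illustration only, not Huldra's actual configuration or API.

type AssetType = "audio" | "image" | "video";

interface SurveyCase {
  id: string;          // identifier for the case shown to a participant
  assetType: AssetType;
  assetUrl: string;    // where the multimedia asset is hosted
  question: string;    // feedback prompt shown alongside the asset
}

interface SurveyConfig {
  title: string;
  cases: SurveyCase[];
  shuffleCases: boolean;    // randomize presentation order per participant
  backendEndpoint: string;  // where collected responses are posted
}

// Example configuration for a small image/video quality study.
const config: SurveyConfig = {
  title: "Multimedia quality assessment demo",
  cases: [
    {
      id: "case-1",
      assetType: "image",
      assetUrl: "https://example.org/assets/img1.png",
      question: "How would you rate the overall quality of this image?",
    },
    {
      id: "case-2",
      assetType: "video",
      assetUrl: "https://example.org/assets/clip1.mp4",
      question: "How would you rate the playback smoothness of this clip?",
    },
  ],
  shuffleCases: true,
  backendEndpoint: "https://example.org/api/responses",
};

interface SurveyResponse {
  participantId: string;
  caseId: string;
  rating: number;      // e.g. a 1-5 opinion score
  timestamp: string;
}

// Post a single participant response to the configured backend.
async function submitResponse(cfg: SurveyConfig, response: SurveyResponse): Promise<void> {
  const res = await fetch(cfg.backendEndpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(response),
  });
  if (!res.ok) {
    throw new Error(`Failed to submit response: ${res.status}`);
  }
}

// Usage: submit a rating for the first case.
submitResponse(config, {
  participantId: "p-001",
  caseId: config.cases[0].id,
  rating: 4,
  timestamp: new Date().toISOString(),
}).catch(console.error);

In a setup like this, changing the study (different assets, questions, or backend) would only require editing the configuration object, which reflects the separation between UI, content, and response collection that the abstract describes.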