
dc.contributor.advisor: Rego Lencastre e Silva, Pedro
dc.contributor.advisor: Lind, Pedro
dc.contributor.author: Naval Ruiz, Arnau
dc.date.accessioned: 2023-11-07T14:38:08Z
dc.date.available: 2023-11-07T14:38:08Z
dc.date.issued: 2023
dc.identifier.uri: https://hdl.handle.net/11250/3101180
dc.description.abstract: Eye-gaze forecasting is a field with a significant number of applications, such as user-interface analysis or improving self-driving cars. Despite its importance, this type of data can be hard to come by due to laws protecting users’ data. We therefore use a vanilla Transformer and Informer, a transformer-based model designed for time-series forecasting, trained on eye-tracking data recorded at OsloMet, to generate realistic artificial data that can be used for further research. To validate the quality of the results, we generate histograms of the distributions of the velocities of the positions and of the angles between points, as well as the autocorrelation; this analysis is compared against the results of a simple linear model and a Markov model. The study, conducted with limited data, which can affect the generalization capabilities of the larger models, finds that the well-established mathematical model significantly outperforms the Deep Learning models. These results indicate that the transformer-based models used may not be adequate for this task. (en_US)
dc.language.iso: eng (en_US)
dc.publisher: Oslomet - storbyuniversitetet (en_US)
dc.title: Generating realistic eye-tracking data with Transformers (en_US)
dc.type: Master thesis (en_US)
dc.description.version: publishedVersion (en_US)
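
The abstract above validates generated trajectories through velocity and angle distributions and autocorrelation. As a rough illustration only, the following is a minimal sketch of how such statistics might be computed for a 2-D gaze trajectory; the function name gaze_statistics, the array shapes, and the placeholder data are assumptions for illustration, not code from the thesis.

    # Minimal sketch (not from the thesis) of the kind of validation the abstract
    # describes: velocity and angle distributions plus autocorrelation for a
    # 2-D gaze trajectory. Shapes and names are assumptions.
    import numpy as np

    def gaze_statistics(xy, max_lag=50):
        """xy: array of shape (T, 2) with gaze positions per sample."""
        d = np.diff(xy, axis=0)                  # displacement per time step
        speed = np.linalg.norm(d, axis=1)        # velocity magnitude
        angle = np.arctan2(d[:, 1], d[:, 0])     # direction of each displacement

        # Autocorrelation of the speed signal for lags 0..max_lag.
        s = speed - speed.mean()
        denom = np.sum(s * s)
        acf = np.array([np.sum(s[:len(s) - k] * s[k:]) / denom
                        for k in range(max_lag + 1)])
        return speed, angle, acf

    # Example: histograms of real vs. generated speeds on a shared binning.
    real = np.cumsum(np.random.randn(1000, 2), axis=0)   # placeholder data
    fake = np.cumsum(np.random.randn(1000, 2), axis=0)   # placeholder data
    speed_r, angle_r, acf_r = gaze_statistics(real)
    speed_f, angle_f, acf_f = gaze_statistics(fake)
    hist_r, edges = np.histogram(speed_r, bins=50, density=True)
    hist_f, _ = np.histogram(speed_f, bins=edges, density=True)

Comparing such histograms and autocorrelation curves for real data, the Transformer/Informer output, and the linear and Markov baselines is the kind of analysis the abstract refers to.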

