Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/107543
Full metadata record
DC Field | Value | Language
dc.date.accessioned | 2023-03-21T14:53:07Z | -
dc.date.available | 2023-03-21T14:53:07Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | Barbara, N. (2022). EOG-based gaze angle estimation with varying head pose (Doctoral dissertation). | en_GB
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/107543 | -
dc.description | Ph.D.(Melit.) | en_GB
dc.description.abstract | This work addresses two major challenges in electrooculography (EOG) signal processing: baseline drift, and gaze estimation under both stationary and non-stationary head conditions. Although several baseline drift mitigation techniques have been proposed, the choice of technique and of the corresponding parameters is generally arbitrary. To this end, this work carries out a systematic performance analysis by applying these different techniques to the same recorded EOG data. This analysis demonstrated that frequent resetting is the overall best-performing technique, followed by signal differencing, wavelet decomposition, high-pass filtering and polynomial fitting. To address the challenge of EOG-based gaze estimation, this work adapted and investigated the use of a published battery model of the eye. When this was applied to offline baseline-drift-mitigated EOG data, horizontal and vertical gaze angle (GA) estimation errors of 2.23±0.48° and 2.39±0.54°, respectively, were obtained, which compare well with the 2.13±0.41° and 2.30±0.53° errors obtained using the state-of-the-art two-bipolar-channel-input linear regression models. However, in contrast to such black-box regression models, the battery model is an explicit, anatomically driven model, which makes it easier to model more complex ocular behaviour. This work also proposes the use of the battery model in a novel offline baseline drift mitigation technique which exploits knowledge of the targets that the subject attended to during EOG signal acquisition. Unlike the state-of-the-art methods, this does not require the data to be zero-centred, nor does it disrupt the EOG signal morphology. This technique was shown to yield generally superior performance when compared to the existing techniques. The battery model is further augmented to represent the blink-related eyelid-induced shunting, and this is used to dynamically model fixations, saccades and blinks within a multiple-model GA estimation framework while simultaneously handling the baseline drift in real time. When applied to short data segments, horizontal and vertical GA estimation errors of 1.64±0.82° and 1.97±0.34°, respectively, were obtained, which compare well with the 1.51±0.55° and 1.95±0.29° errors obtained using the state-of-the-art method, while the proposed method yielded a statistically significantly superior GA estimation performance on longer data segments. This framework achieved a global eye movement detection and labelling F-score exceeding 90%. This work also lifts the stationary-head constraint that has generally been enforced in the literature so far. Specifically, it generalises the two-eye verging gaze geometry to cater for an arbitrary head pose and position, and also models vestibulo-ocular reflexes (VORs) in the proposed multiple-model framework. Using the proposed method, horizontal and vertical GA estimation errors of 1.85±0.51° and 2.19±0.62°, respectively, and an eye movement detection and labelling F-score of approximately 90% were obtained. | en_GB
dc.language.iso | en | en_GB
dc.rights | info:eu-repo/semantics/restrictedAccess | en_GB
dc.subject | Electrooculography | en_GB
dc.subject | Signal processing | en_GB
dc.subject | Wavelets (Mathematics) | en_GB
dc.subject | Eye -- Movements | en_GB
dc.subject | Human-computer interaction | en_GB
dc.subject | Eye tracking | en_GB
dc.subject | Gaze | en_GB
dc.subject | Regression analysis | en_GB
dc.title | EOG-based gaze angle estimation with varying head pose | en_GB
dc.type | doctoralThesis | en_GB
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB
dc.publisher.institution | University of Malta | en_GB
dc.publisher.department | Faculty of Engineering | en_GB
dc.description.reviewed | N/A | en_GB
dc.contributor.creator | Barbara, Nathaniel (2022) | -
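The abstract above ranks several baseline drift mitigation techniques, including high-pass filtering and polynomial fitting. The following is a minimal illustrative sketch of those two techniques applied to a synthetic drifting EOG-like signal; it is not taken from the dissertation, and the sampling rate, drift model, filter order and cut-off frequency are assumptions chosen only for demonstration.

# Illustrative comparison of two of the EOG baseline drift mitigation
# techniques named in the abstract: polynomial fitting and high-pass
# filtering. The synthetic signal and all parameters are placeholders,
# not the thesis data or settings.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 250.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)                  # 30 s of data
rng = np.random.default_rng(0)

# Synthetic "clean" EOG: step-like saccades between fixations (in uV).
clean = np.zeros_like(t)
for onset, amp in [(5, 200.0), (12, -150.0), (20, 300.0)]:
    clean[t >= onset] += amp                  # saccade modelled as a step

drift = 15.0 * t + 40.0 * np.sin(2 * np.pi * 0.01 * t)   # slow baseline drift
eog = clean + drift + rng.normal(0, 5.0, t.size)          # measured signal

# Technique 1: polynomial fitting (subtract a low-order polynomial trend).
coeffs = np.polyfit(t, eog, deg=2)
poly_corrected = eog - np.polyval(coeffs, t)

# Technique 2: zero-phase high-pass filtering (2nd-order Butterworth, 0.1 Hz).
sos = butter(2, 0.1, btype="highpass", fs=fs, output="sos")
hp_corrected = sosfiltfilt(sos, eog)

# Compare each corrected signal against the zero-mean clean reference.
ref = clean - clean.mean()
for name, est in [("polynomial fit", poly_corrected - poly_corrected.mean()),
                  ("high-pass filter", hp_corrected - hp_corrected.mean())]:
    rmse = np.sqrt(np.mean((est - ref) ** 2))
    print(f"{name}: RMSE = {rmse:.1f} uV")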
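The abstract also benchmarks the battery model against two-bipolar-channel-input linear regression models for gaze angle estimation. Below is a minimal sketch of such a regression baseline fitted to synthetic, drift-mitigated EOG amplitudes; the channel gains, cross-talk, noise level and target layout are invented for illustration and do not reproduce the dissertation's models or reported errors.

# Minimal sketch of a two-bipolar-channel linear regression gaze angle (GA)
# estimator: map horizontal and vertical bipolar EOG amplitudes to
# horizontal/vertical gaze angles via least squares. All data are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Calibration targets: gaze angles (degrees) the subject fixated.
gaze = np.column_stack([rng.uniform(-30, 30, 100),     # horizontal GA
                        rng.uniform(-20, 20, 100)])    # vertical GA

# Simulated drift-mitigated bipolar EOG amplitudes (uV): roughly linear in
# the gaze angles, with a little cross-talk and measurement noise (assumed).
mix = np.array([[14.0, 1.0],
                [0.5, 11.0]])
eog = gaze @ mix.T + rng.normal(0, 8.0, gaze.shape)

# Fit one linear model per gaze component, with an intercept term.
X = np.column_stack([eog, np.ones(len(eog))])
W, *_ = np.linalg.lstsq(X, gaze, rcond=None)

# Evaluate on fresh samples drawn from the same synthetic model.
gaze_test = np.column_stack([rng.uniform(-30, 30, 50),
                             rng.uniform(-20, 20, 50)])
eog_test = gaze_test @ mix.T + rng.normal(0, 8.0, gaze_test.shape)
pred = np.column_stack([eog_test, np.ones(len(eog_test))]) @ W

err = np.abs(pred - gaze_test)
print(f"mean horizontal GA error: {err[:, 0].mean():.2f} deg")
print(f"mean vertical GA error:   {err[:, 1].mean():.2f} deg")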
Appears in Collections: Dissertations - FacEng - 2022

Files in This Item:
File | Description | Size | Format
No Access.pdf | | 77.75 kB | Adobe PDF

