Please use this identifier to cite or link to this item:
https://www.um.edu.mt/library/oar/handle/123456789/91734
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Barbara, Nathaniel | - |
dc.contributor.author | Camilleri, Tracey A. | - |
dc.contributor.author | Camilleri, Kenneth P. | - |
dc.date.accessioned | 2022-03-18T06:59:00Z | - |
dc.date.available | 2022-03-18T06:59:00Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Barbara, N., Camilleri, T. A., & Camilleri, K. P. (2019). EOG-based eye movement detection and gaze estimation for an asynchronous virtual keyboard. Biomedical Signal Processing and Control, 47, 159-167. | en_GB |
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/91734 | - |
dc.description.abstract | This work aims to develop a novel electrooculography (EOG)-based virtual keyboard with a standard QWERTY layout which, unlike similar state-of-the-art systems, allows users to reach any icon from any location directly and asynchronously. The saccadic EOG potential displacement is mapped to angular gaze displacement using a novel two-channel input linear regression model, which considers features extracted from both the horizontal and vertical EOG signal components jointly. Using this technique, gaze displacement estimation errors of 1.32 ± 0.26° and 1.67 ± 0.26° were achieved in the horizontal and vertical directions respectively; this performance was also found to be, in general, statistically significantly better than that obtained when a separate model is used for each EOG component to describe the horizontal and vertical relationships independently, as is typical in the literature. Furthermore, this work also proposes a threshold-based method to detect eye movements from EOG signals in real time; detected movements are then classified as saccades or blinks using a novel cascade of a parametric and a signal-morphological classifier based on EOG peak and gradient features. This resulted in average saccade and blink labelling accuracies of 99.92% and 100.00% respectively, demonstrating that these two eye movements can be reliably detected and discriminated in real time using the proposed algorithms. When these techniques were used to interface with the proposed asynchronous EOG-based virtual keyboard, an average writing speed across subjects of 11.89 ± 4.42 characters per minute was achieved, a performance which was shown to improve substantially with user experience. (An illustrative sketch of the gaze mapping and saccade/blink classification steps is given after the metadata table below.) | en_GB |
dc.language.iso | en | en_GB |
dc.publisher | Elsevier | en_GB |
dc.rights | info:eu-repo/semantics/restrictedAccess | en_GB |
dc.subject | Electrooculography | en_GB |
dc.subject | Human-computer interaction | en_GB |
dc.subject | Wireless communication systems | en_GB |
dc.subject | Keyboards (Electronics) | en_GB |
dc.subject | Eye -- Movements | en_GB |
dc.title | EOG-based eye movement detection and gaze estimation for an asynchronous virtual keyboard | en_GB |
dc.type | article | en_GB |
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and make use of the information contained in it in accordance with the Copyright Legislation, provided that the author is properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB |
dc.description.reviewed | peer-reviewed | en_GB |
dc.identifier.doi | 10.1016/j.bspc.2018.07.005 | - |
dc.publication.title | Biomedical Signal Processing and Control | en_GB |
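The abstract above describes two algorithmic steps: a two-channel linear regression that maps EOG potential displacements to angular gaze displacements, and a threshold-based event detector whose output is passed through a parametric-then-morphological cascade to separate saccades from blinks. The snippet below is a minimal Python sketch of that general idea only; the feature definitions, threshold values, and the use of scikit-learn's LinearRegression are illustrative assumptions and do not reproduce the paper's actual implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# --- Gaze displacement estimation ------------------------------------------
# Hypothetical two-channel regression: the horizontal (dV_h) and vertical
# (dV_v) EOG potential displacements are used jointly as inputs for BOTH
# gaze components, in contrast to the single-channel models (dV_h ->
# horizontal only, dV_v -> vertical only) common in the literature.

def fit_two_channel_model(dV_h, dV_v, gaze_h, gaze_v):
    """Fit one regression per gaze direction, each seeing both EOG channels."""
    X = np.column_stack([dV_h, dV_v])            # joint two-channel input
    model_h = LinearRegression().fit(X, gaze_h)  # horizontal gaze angle (deg)
    model_v = LinearRegression().fit(X, gaze_v)  # vertical gaze angle (deg)
    return model_h, model_v

# --- Eye movement detection and saccade/blink discrimination ---------------
# Illustrative threshold-based detector followed by a simple two-stage
# (parametric, then signal-morphology) check on the vertical channel.
# All thresholds below are arbitrary example values.

def detect_event(window_h, window_v, onset_threshold=50e-6):
    """Flag an eye movement when either channel's excursion exceeds a threshold."""
    return (np.ptp(window_h) > onset_threshold) or (np.ptp(window_v) > onset_threshold)

def classify_event(window_v, fs, peak_threshold=150e-6, gradient_threshold=0.01):
    """Label a detected event as 'blink' or 'saccade' from peak and gradient cues."""
    peak = np.max(np.abs(window_v - window_v[0]))
    gradient = np.max(np.abs(np.diff(window_v))) * fs
    # Stage 1 (parametric): blinks typically show a large, sharp vertical peak.
    if peak > peak_threshold and gradient > gradient_threshold:
        # Stage 2 (morphological): blinks return to baseline, saccades step and hold.
        returns_to_baseline = abs(window_v[-1] - window_v[0]) < 0.3 * peak
        return "blink" if returns_to_baseline else "saccade"
    return "saccade"
```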
Appears in Collections: Scholarly Works - FacEngSCE
Files in This Item:
File | Description | Size | Format
---|---|---|---
EOG-based eye movement detection and gaze estimation for an asynchronous virtual keyboard.pdf (Restricted Access) | | 2.02 MB | Adobe PDF
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.