Left to right: Prof. Kenneth Camilleri, WildEye Project Coordinator, Department of Systems and Control Engineering, University of Malta; Dr Stefania Cristina, WildEye Senior Researcher, Department of Systems and Control Engineering, University of Malta; Mr Kenneth Bone, WildEye Project Manager (Commercial Partner), Seasus Ltd.
The University of Malta, in partnership with Seasus Ltd, has developed WildEye, a novel passive eye-gaze tracking platform that provides an alternative communication channel for persons with impaired motor skills. Through this project, users will be able to perform everyday activities such as operating a computer, improving their quality of life and independence. The platform also offers an additional access method for all users, permitting an auxiliary control input for computer applications such as the control of IoT devices.
The €193,943.38 project was funded through the Malta Council for Science and Technology (MCST) FUSION R&I Programme.
Addressing a Public Engagement Event, held on 25 January 2022 at the Chaplain’s Hall in the Esplora Planetarium Building, Kalkara, the Hon. Owen Bonnici, Minister for Equality, Research and Innovation, said: “This is truly research and innovation at its best, put to really good use. The WildEye project will definitely leave a concrete, positive impact on people’s daily lives. This is exactly why we will never tire of affirming the importance of investment in R&I to generate economic wealth for our country, but also and most importantly to truly improve people’s lives.”
Dr Melchior Cini, MCST Deputy Director, R&I Programmes Unit, said: “Innovation is all about moving away from processes which have stagnated and are only retained due to a fear of failure when it comes to implementing new approaches. It is about taking risks and being creative. Innovation requires collaboration, implementation, and value creation. Today, we had the opportunity to appreciate the remarkable accomplishments of a consortium composed of competent engineers and industrious academics. Through hard work and focussed research, they shifted the paradigms of existing tracking-application technologies and broke new ground in this area.” He added that “MCST is optimistic that our investment in such projects will continue to support those who are motivated and determined enough to find solutions to the complex questions faced by our society.”
The WildEye Project Leader said: “There is growing interest in the development of technologies that provide alternative communication channels, with eye-gaze tracking being one of the most developed. Our ambition is to develop ubiquitous eye-gaze tracking technology such that users may control devices without requiring them to be sitting in front of a computer.
WildEye is one further step along this road.” In the developed platform, eye and head movements are captured in a stream of image frames acquired by a webcam and subsequently processed by a computer to estimate the eye-gaze direction by combining the eye and head pose components. The estimated eye-gaze point is then mapped to the computer screen to control a graphical user interface that has been purposely designed around the achievable gaze estimation accuracy. The graphical user interface features 15 buttons that may be activated by gazing upon them for a preset period of time, or by blinking the eyes to confirm the selection.
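As a rough illustration of the screen-mapping and dwell-time selection mechanism described above, the sketch below shows how an estimated on-screen gaze point might be mapped to one of 15 buttons and activated after a fixed dwell period. It is a minimal sketch under assumed parameters: the DwellSelector class, the gaze_to_button helper, the 3×5 button layout and the 1.5-second dwell threshold are all hypothetical and are not taken from the WildEye implementation.

```python
import time

DWELL_SECONDS = 1.5          # assumed dwell threshold before a button activates
GRID_ROWS, GRID_COLS = 3, 5  # 15 buttons; grid layout is assumed, not from WildEye


def gaze_to_button(x_norm, y_norm):
    """Map a normalised on-screen gaze point (0..1 in each axis) to a button index."""
    if not (0.0 <= x_norm <= 1.0 and 0.0 <= y_norm <= 1.0):
        return None  # gaze estimate falls outside the screen
    col = min(int(x_norm * GRID_COLS), GRID_COLS - 1)
    row = min(int(y_norm * GRID_ROWS), GRID_ROWS - 1)
    return row * GRID_COLS + col


class DwellSelector:
    """Report a button selection once gaze has rested on it for DWELL_SECONDS."""

    def __init__(self):
        self.current_button = None
        self.dwell_start = None

    def update(self, button_index, now=None):
        now = time.monotonic() if now is None else now
        if button_index != self.current_button:
            # Gaze moved to a different button (or off-screen): restart the timer.
            self.current_button = button_index
            self.dwell_start = now
            return None
        if button_index is not None and now - self.dwell_start >= DWELL_SECONDS:
            # Dwell threshold reached: report the selection and reset the timer.
            self.dwell_start = now
            return button_index
        return None


# Example: feed per-frame gaze estimates into the selector.
selector = DwellSelector()
for x, y in [(0.52, 0.48)] * 60:            # hypothetical stream of gaze points
    selected = selector.update(gaze_to_button(x, y))
    if selected is not None:
        print(f"Button {selected} activated")
```

A blink-confirmation trigger, as mentioned in the description, could replace the dwell check by activating the currently gazed-at button when a blink is detected instead of waiting for the timer to elapse.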
Eye movements have long been recognised as an alternative channel for communicating with, or even controlling, machines such as a computer. The wealth of information inherent in eye movements has attracted interest over the years, leading to a host of eye-gaze tracking applications in several fields, including assistive communication, driver assistance, and marketing and advertising research.
A panel discussion on the field followed, with the participation of the project team members, moderated by an Esplora Science Communicator. The event was streamed live.