Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/49132
Full metadata record
dc.date.accessioned: 2019-11-29T12:13:39Z
dc.date.available: 2019-11-29T12:13:39Z
dc.date.issued: 2019
dc.identifier.citation: Tabone, A. (2019). Automated page turner for musicians (Bachelor's dissertation). [en_GB]
dc.identifier.uri: https://www.um.edu.mt/library/oar/handle/123456789/49132
dc.description: B.ENG.(HONS) [en_GB]
dc.description.abstract: Page turning has at some point frustrated every musician, who must temporarily abandon the performance to turn the physical page by hand. During a performance, a pianist uses both hands to play the instrument, so turning the page means quickly lifting at least one hand from the keyboard. This may lead to various types of performance errors, and every musician has to develop his or her own method of overcoming this annoyance; in fact, good music book editors edit the music so that a long note or a pause falls towards the end of any page where a turn is necessary. The objective of this dissertation is to develop an automated page turner that tracks the user's progression through the music score using an eye gaze tracking system. To successfully design and implement the fully automated page turner, the first step was to understand how the musician interacts with the score. This led to a data collection and processing stage, from which the design criteria of the page turning application were then set. Future improvements based on the outcomes of this project would include the use of a tablet camera to record eye gaze and the tablet screen to display music. With this in mind, the score was divided into pages containing only two lines of music, as described in [1]. Under such conditions, half-page turns are implemented, replacing individual lines of music. This music format was presented to the test subjects, whose eye gaze and performance were recorded using the eye gaze sensor system and a digital piano respectively. From the information collected, it was observed that sensor deviations, and instances where the musician looks down at the keyboard, cause the tracker to lose track of the eyes and return redundant or zero values. Hence, a Kalman filter was included in the system to smooth out the readings.
The sensitive areas on the score were set by using image processing to detect the bar lines, so that an understanding of the temporal structure of the score could be obtained. The values returned by the eye gaze tracker differ from the information obtained from score processing in terms of resolution and screen utilisation: the tracker returns coordinates spanning the whole screen, whereas the image values are confined to a figure, so a scaling and compensating function was required to map the two coordinate systems onto each other. These functions were tested separately and then integrated into a single application using Matlab's GUIDE environment. Initially, the page turning application exhibited a small number of redundant page turns, but upon inspection these were all linked to the same sequence of events in the musician's performance. By tuning the sizes of the sensitive areas and the resistance towards triggering page turns, a better system was achieved. Most of the observed problems were caused by instances where the sensor loses track of the eyes for a long period, because the user glances at the keyboard or shifts to play at the extremities of the piano. These instances were tackled by replacing the eye gaze inputs with inputs based on a model and the previous positions, which estimate where the user would have looked. The result was a page turner with an accuracy of 98.27% that suffered only from delayed page turns, eliminating the instances in which the previous version would have triggered an early page turn and replaced the line currently being performed. [en_GB]
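The dropout handling described in the abstract — smoothing noisy gaze coordinates with a Kalman filter and falling back on a model-based prediction when the tracker loses the eyes and returns zero values — can be sketched as follows. This is a minimal illustration, not the dissertation's Matlab implementation; the constant-velocity model, sampling rate, and noise parameters are all assumptions.

```python
import numpy as np

def smooth_gaze(readings, dt=1/30, q=50.0, r=20.0):
    """Constant-velocity Kalman filter over a 1-D gaze coordinate.

    Zero or missing samples (tracker lost the eyes) skip the update
    step, so the filter carries forward its model-based prediction
    instead of the invalid measurement.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])         # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])                    # we observe position only
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])        # process noise
    R = np.array([[r]])                           # measurement noise
    x = np.array([[readings[0]], [0.0]])          # initial state
    P = np.eye(2) * 100.0                         # initial covariance
    out = []
    for z in readings:
        # Predict: propagate the motion model one step.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update only when the tracker returned a valid sample.
        if z is not None and z > 0:
            y = np.array([[z]]) - H @ x           # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out
```

During a dropout (e.g. the pianist glancing at the keyboard), the filter's output simply follows the predicted trajectory, which mirrors the abstract's replacement of lost gaze inputs by estimates derived from the model and previous positions.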
dc.language.iso: en [en_GB]
dc.rights: info:eu-repo/semantics/restrictedAccess [en_GB]
dc.subject: Eye tracking [en_GB]
dc.subject: Kalman filtering [en_GB]
dc.subject: Musicians [en_GB]
dc.title: Automated page turner for musicians [en_GB]
dc.type: bachelorThesis [en_GB]
dc.rights.holder: The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. [en_GB]
dc.publisher.institution: University of Malta [en_GB]
dc.publisher.department: Faculty of Engineering. Department of Systems & Control Engineering [en_GB]
dc.description.reviewed: N/A [en_GB]
dc.contributor.creator: Tabone, André
Appears in Collections:Dissertations - FacEng - 2019
Dissertations - FacEngSCE - 2019

Files in This Item:
File: 19BENGEE21_ Tabone Andre.pdf (Restricted Access, 3.2 MB, Adobe PDF)
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.