Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/95123
Title: A hybrid motion capture approach for high-speed movement
Authors: Spiteri, Kevin (2014)
Keywords: Human-computer interaction
Machine learning
Video games
Issue Date: 2014
Citation: Spiteri, K. (2014). A hybrid motion capture approach for high-speed movement (Bachelor's dissertation).
Abstract: Interaction with digital video games has gone beyond the traditional use of input devices. With many inexpensive off-the-shelf motion sensors available, an enhanced interaction experience is now easier to achieve. In this dissertation, a number of motion capture devices have been used to design and develop a Sensor Fusion Framework (SFF). This framework makes use of several techniques, including dynamic weighting, stiff joints and object tracking, to process the data obtained from the various devices in order to build and improve the animation performed by the actor. A number of scenarios covering different situations were set up with the aim of evaluating specific techniques. From these scenarios, it was evident that when occlusion occurred, the proposed framework showed improvements over the base sensor and tracked the actual movements performed by the actor quite closely. On the other hand, when speed and precision were required and occlusion did not occur, the base sensor alone proved to be much more accurate and consistent. This was mainly because the framework suffered from latency with certain Bluetooth devices and incurred additional computational costs to process the different streams of data.
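The dissertation itself is under restricted access, so the exact dynamic-weighting algorithm is not reproduced here. As a minimal illustrative sketch only, assuming the hypothetical helper `fuse` and per-sensor confidence values, the general idea of dynamically weighted sensor fusion with occlusion handling could look like this:

```python
# Illustrative sketch of dynamic weighting: fuse per-joint position
# estimates from several motion sensors, weighting each reading by a
# confidence value and skipping sensors that report no data (occlusion).
# The function name and the (position, confidence) representation are
# assumptions for this example, not the SFF's actual interface.

def fuse(readings):
    """readings: list of (position, confidence) pairs, where position
    is an (x, y, z) tuple or None when the sensor is occluded, and
    confidence is in [0, 1]. Returns the confidence-weighted average
    position, or None if every sensor is occluded."""
    valid = [(p, c) for p, c in readings if p is not None and c > 0]
    if not valid:
        return None  # all sensors occluded: no estimate available
    total = sum(c for _, c in valid)
    return tuple(
        sum(p[i] * c for p, c in valid) / total
        for i in range(3)
    )

# The second sensor is occluded (None), so the fused estimate is
# computed from the remaining two readings only.
print(fuse([((1.0, 2.0, 0.0), 0.9), (None, 0.0), ((1.2, 2.2, 0.0), 0.3)]))
```

In a scheme like this, an occluded base sensor simply contributes nothing, so the remaining devices carry the estimate, which matches the abstract's observation that the framework helps most under occlusion.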
Description: B.Sc. IT (Hons)(Melit.)
URI: https://www.um.edu.mt/library/oar/handle/123456789/95123
Appears in Collections:Dissertations - FacICT - 2014
Dissertations - FacICTCS - 2010-2015

Files in This Item:
File: BSC(HONS)ICT_Spiteri, Kevin_2014.PDF
Description: Restricted Access
Size: 9.26 MB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.