Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/93315
Title: Facial expressions and lip synchronisation to speech
Authors: Cachia, Kenneth
Keywords: Computer animation; Face perception; Facial expression
Issue Date: 2006
Citation: Cachia, K. (2006). Facial expressions and lip synchronisation to speech (Bachelor's dissertation).
Abstract: Every human face is unique in shape, size and features. However, all faces share the same movements when showing a particular expression or emotion. The same holds during speech, as we instinctively link facial movements to particular sounds in a language. Facial animation is the area of computer animation that aims to model real faces in detail. The first step is to model the shape of the human head, which can then be used to model facial expressions, movements and emotions. Speech can also be simulated by such systems, with lip movement synchronised to a sound file. In this dissertation, we propose a system that can animate any static 3D model of a human head. First, we model a set of facial components that simulate facial anatomy. Once these components are attached to a 3D model, expressions and movements can be created. Given a sentence in English, the system generates the lip synchronisation and adds further expressions, which are automatically synchronised to the speech to complete the animation.
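
The abstract describes a text-driven pipeline: an English sentence is converted into lip movements that are synchronised with the speech, with further expressions layered on top. Since the dissertation itself is restricted, the Python sketch below is only an illustration of how a phoneme-to-viseme mapping for lip synchronisation is commonly set up; the phoneme symbols, viseme names, timing values and function names are assumptions for illustration, not taken from the work.

# Illustrative sketch (not from the dissertation): turning a phoneme
# sequence into viseme keyframes that an animation system could
# interpolate between. The table and timings below are hypothetical.

PHONEME_TO_VISEME = {
    "AA": "open",    # as in "father"
    "IY": "wide",    # as in "see"
    "UW": "round",   # as in "blue"
    "M":  "closed",  # bilabial closure
    "B":  "closed",
    "P":  "closed",
    "F":  "dental",  # labiodental
    "V":  "dental",
    "SIL": "rest",   # silence / neutral mouth pose
}

def phonemes_to_keyframes(phonemes, frame_rate=25, phoneme_duration=0.08):
    """Convert an ordered phoneme sequence into (frame, viseme) keyframes.

    Each phoneme is assumed to last phoneme_duration seconds; unknown
    phonemes fall back to the neutral "rest" viseme.
    """
    keyframes = []
    time = 0.0
    for ph in phonemes:
        viseme = PHONEME_TO_VISEME.get(ph, "rest")
        frame = round(time * frame_rate)
        keyframes.append((frame, viseme))
        time += phoneme_duration
    return keyframes

if __name__ == "__main__":
    # Example: the word "mob" (M AA B) followed by silence.
    for frame, viseme in phonemes_to_keyframes(["M", "AA", "B", "SIL"]):
        print(f"frame {frame:3d}: {viseme}")

In a full system, the resulting keyframes would drive the mouth-related facial components attached to the 3D head model, while the frame timings keep the animation aligned with the audio track.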
Description: B.Sc. IT (Hons)(Melit.)
URI: https://www.um.edu.mt/library/oar/handle/123456789/93315
Appears in Collections: Dissertations - FacICT - 1999-2009; Dissertations - FacICTCS - 1999-2007

Files in This Item:
File: B.SC.(HONS)IT_Cachia_Kenneth_2006.pdf (Restricted Access)
Size: 26.26 MB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.