Abstract
In this work we propose a talking head system for animating facial expressions using a template face generated from partial differential equations (PDEs). It uses a set of pre-configured curves to calculate an internal template face surface. This surface is then used to associate various facial features with a given 3D face object. Motion retargeting is then used to transfer the deformations in these areas from the template to the target object. The procedure continues until all the expressions in the database have been calculated and transferred to the target 3D human face object. Additionally, the system interacts with the user through an artificial intelligence (AI) chatterbot, which generates a response from a given text. Speech and facial animation are synchronized using the Microsoft Speech API, where the response from the AI bot is converted to speech.
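The abstract does not state the governing equation; as a hedged sketch, PDE-based surface generation of this kind typically solves a fourth-order elliptic equation over a parametric domain, with the pre-configured curves supplying the boundary conditions (the exact formulation used in the paper is an assumption here):

\[
\left( \frac{\partial^{2}}{\partial u^{2}} + a^{2}\,\frac{\partial^{2}}{\partial v^{2}} \right)^{2} \mathbf{X}(u,v) = 0,
\]

where \(\mathbf{X}(u,v)\) is the template surface, \((u,v)\) are the surface parameters, and \(a\) is a smoothing parameter controlling the relative influence of the two parametric directions; positional and derivative boundary conditions are taken from the configured curves.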
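As a further illustration of the retargeting step, the sketch below transfers per-vertex deformations of a template expression onto a target face using a vertex correspondence. All names (transfer_expression, correspondence, scale) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def transfer_expression(template_neutral, template_expr, target_neutral,
                        correspondence, scale=1.0):
    """Copy template deformations onto a target face mesh.

    template_neutral, template_expr : (N, 3) arrays of template vertices
        in the neutral pose and in one database expression.
    target_neutral : (M, 3) array of target face vertices.
    correspondence : dict mapping target vertex index -> template vertex
        index, built from the facial-feature association step.
    scale : optional factor to adapt displacements to the target's size.
    """
    displacements = template_expr - template_neutral  # template motion per vertex
    target_expr = target_neutral.copy()
    for tgt_idx, tpl_idx in correspondence.items():
        target_expr[tgt_idx] += scale * displacements[tpl_idx]
    return target_expr

# Repeating this for every expression in the database yields the full
# retargeted expression set for the target 3D face.
```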
Citation
Athanasopoulos M, Ugail H and Gonzalez Castro G (2010) On the development of an interactive talking head system. In: Proceedings of the International Conference on Cyberworlds, Singapore, 20-22 Oct 2010: 414-420.
Link to Version of Record
https://doi.org/10.1109/CW.2010.53
Type
Conference paper