BRADFORD SCHOLARS


    The computational face for facial emotion analysis: Computer based emotion analysis from the face

    View/Open
    al-dahoud, a.pdf (3.868 MB)
    Author
    Al-dahoud, Ahmad
    Supervisor
    Ugail, Hassan
    Keyword
    Facial emotion analysis
    Facial expressions
    Human-machine interaction
    Rights
    Creative Commons License
    The University of Bradford theses are licensed under a Creative Commons Licence.
    Institution
    University of Bradford
    Department
    Faculty of Engineering and Informatics
    Awarded
    2018
    
    Abstract
    Facial expressions are considered the most revealing indicator of a person's psychological state during face-to-face communication. It is believed that more natural interaction between humans and machines can be achieved through a detailed understanding of the different facial expressions that mirror the way humans communicate with each other. In this research, we study different aspects of facial emotion detection and analysis, and we investigate possible hidden identity clues within facial expressions. We examine a deeper aspect of facial expressions by attempting to identify gender and human identity, which can be considered a form of emotional biometric, using only the dynamic characteristics of the smile expression. Further, we present a statistical model for analysing the relationship between facial features and Duchenne (genuine) and non-Duchenne (posed) smiles, and we find that the expressions around the eyes contain features that discriminate between the two. Our results indicate that facial expressions can be identified through facial movement analysis models, with an accuracy rate of 86% for classifying the six universal facial expressions and 94% for classifying the 18 common facial action units. Further, we successfully identify gender using only the dynamic characteristics of the smile expression, obtaining an 86% classification rate. Likewise, we present a framework for studying the possibility of using the smile as a biometric, showing that the human smile is unique and stable.
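
    The thesis itself is not reproduced on this page, so the following is only a hypothetical sketch of the kind of pipeline the abstract describes: summarising the dynamic characteristics of tracked facial landmarks over a video clip and training a classifier on those features. The function names, the (frames, points, 2) landmark format, and the choice of a scikit-learn SVM are illustrative assumptions, not details taken from the thesis.

        # Hypothetical sketch only: turn tracked facial landmarks into
        # dynamic-movement features and train a classifier on them.
        import numpy as np
        from sklearn.svm import SVC

        def dynamic_features(landmarks):
            # landmarks: assumed array of shape (frames, points, 2) holding
            # landmark positions tracked across one video clip.
            displacement = np.diff(landmarks, axis=0)       # frame-to-frame movement
            speed = np.linalg.norm(displacement, axis=2)    # per-landmark speed per frame
            # Summarise each landmark's motion over the clip (mean and peak speed).
            return np.concatenate([speed.mean(axis=0), speed.max(axis=0)])

        def train_classifier(sequences, labels):
            # sequences: list of landmark arrays (one per clip); labels: e.g. one of
            # the six universal expressions, or gender for the smile-dynamics study.
            X = np.stack([dynamic_features(seq) for seq in sequences])
            clf = SVC(kernel="rbf")
            clf.fit(X, labels)
            return clf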
    URI
    http://hdl.handle.net/10454/17384
    Type
    Thesis
    Qualification name
    PhD
    Collections
    Theses
