Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/137835
Title: Using computer vision to analyse emotions in a two-person interview
Authors: Vassallo Pulis, Matthias (2025)
Keywords: Emotion recognition -- Malta
Computer vision -- Malta
Natural language processing (Computer science)
Machine learning
Issue Date: 2025
Citation: Vassallo Pulis, M. (2025). Using computer vision to analyse emotions in a two-person interview (Bachelor's dissertation).
Abstract: In recent years, as technology has evolved, emotion recognition has gained prominence across fields such as human resources, psychology, and marketing. However, interpreting emotions in media content remains challenging due to their inherent subjectivity. This project addresses this issue by developing a tool that automatically analyses the emotions exhibited in two-person interviews. The proposed framework uses computer vision, natural language processing, and machine learning techniques to extract key information about individuals and their emotions from interview videos. These techniques include facial detection and recognition, facial attribute analysis, and Named-Entity Recognition (NER), the last of which was mainly used to identify the names of the individuals present in a given interview video. The main objective of this project is to explore effective methods for analysing emotions exhibited in two-person interviews, and subsequently to develop a system that can identify the individuals appearing in an interview video and predict the emotions both individuals convey throughout it. This process involved a dataset of 24 videos, all featuring differences in lighting, occlusion, and face orientation. Once the analysis is complete, users are presented with an easy-to-understand summary in the form of graph visualisations. These cover several areas, including the distribution of emotions for both the interviewer and the guest, along with the emotional changes that take place throughout the video. The results showed that the developed system can track individuals and their emotions with high performance, achieving 98.7% accuracy for facial recognition, alongside a modest 36% accuracy for emotion recognition, which reflects the subjectivity bias present in the annotations gathered during the evaluation process on a limited subset of the collected videos. While the approach to two-person emotion analysis taken in this project is promising, further work is needed in certain areas, especially in improving accuracy and increasing the variety of sources that can be used.
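The abstract describes summarising per-frame emotion predictions into per-speaker distributions (e.g. for the interviewer and the guest). A minimal sketch of that aggregation step is shown below; the input format and function name are illustrative assumptions, not taken from the dissertation itself.

```python
from collections import Counter

def emotion_distribution(frames):
    """Aggregate per-frame (speaker, emotion) predictions into
    per-speaker emotion distributions (fractions summing to 1).

    `frames` is a hypothetical list of (speaker, emotion) pairs,
    one per analysed video frame.
    """
    counts = {}
    for speaker, emotion in frames:
        counts.setdefault(speaker, Counter())[emotion] += 1
    return {
        speaker: {emo: n / sum(c.values()) for emo, n in c.items()}
        for speaker, c in counts.items()
    }

# Hypothetical per-frame predictions for a short clip
frames = [
    ("interviewer", "neutral"),
    ("interviewer", "happy"),
    ("guest", "happy"),
    ("guest", "happy"),
]
dist = emotion_distribution(frames)
# dist["guest"]["happy"] is 1.0; the interviewer splits 0.5 / 0.5
```

Distributions like `dist` could then be plotted directly as the graph visualisations the abstract mentions.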
Description: B.Sc. (Hons) ICT(Melit.)
URI: https://www.um.edu.mt/library/oar/handle/123456789/137835
Appears in Collections:Dissertations - FacICT - 2025
Dissertations - FacICTAI - 2025

Files in This Item:
File: 2508ICTICT390900017272_1.PDF (Restricted Access)
Size: 16.72 MB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.