Date of Publication
2024
Document Type
Dissertation/Thesis
Degree Name
Master of Science in Computer Science
College
College of Computer Studies
Department/Unit
Software Technology
Thesis Advisor
Judith J. Azcarraga, Ph.D.
Defense Panel Chair
Joel P. Ilao, Ph.D.
Defense Panel Member
Jocelynn W. Cu
Abstract (English)
This study investigates the spatial and temporal characteristics of facial expressions to classify academic emotions—boredom, confusion, engagement, and frustration—using the DAiSEE dataset. Two frame selection methods, Targeted and Changepoint, extracted three frames from each 10-second video, and deltas (Euclidean distance, cosine similarity) captured temporal changes in facial landmarks. Six feature selection techniques, one informed by teacher interviews, were tested with five machine learning algorithms: K-Nearest Neighbor (KNN), Decision Tree (DT), Multilayer Perceptron (MLP), Convolutional Neural Network (CNN), and Support Vector Machine (SVM). The findings suggest that Changepoint frame selection has a slight edge in classifying emotions involving dynamic expressions, such as engagement and frustration, whereas the Targeted method performs well across all emotions, especially more static ones like boredom and confusion. Interestingly, models trained on features from only the left side of the face performed comparably to those using features from the entire face, underscoring the relevance of specific facial regions in emotion classification. Additionally, these experiments validated teachers’ insights regarding which parts of the face typically reflect academic emotions. Whole feature selection generally excelled in boredom and engagement classification, while Teacher and Left feature selections were most effective for confusion and frustration, respectively. KNN and CNN consistently outperformed other algorithms, with KNN being most effective for boredom and CNN for confusion, engagement, and frustration. These findings underscore the potential of integrating temporal dynamics and feature selection to improve academic emotion classification accuracy.
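The abstract describes delta features—Euclidean distance and cosine similarity—computed over facial landmark points across frames. A minimal sketch of how such deltas might be computed between two frames of landmarks is shown below; the function name, input layout, and use of NumPy are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def landmark_deltas(frame_a, frame_b):
    """Compute per-landmark Euclidean distances and an overall cosine
    similarity between two frames of (x, y) facial landmark coordinates.

    frame_a, frame_b: sequences of (x, y) pairs, one per landmark.
    """
    a = np.asarray(frame_a, dtype=float)
    b = np.asarray(frame_b, dtype=float)
    # Euclidean distance each landmark moved between the two frames
    euclid = np.linalg.norm(b - a, axis=1)
    # Cosine similarity of the flattened landmark vectors
    va, vb = a.ravel(), b.ravel()
    cosine = (va @ vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
    return euclid, cosine
```

Deltas like these, taken between the three selected frames of each clip, would yield the temporal-change features the abstract refers to.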
Abstract Format
html
Language
English
Recommended Citation
Resurreccion, P. H. (2024). Classifying Academic Emotions Using Changes in Facial Landmark Points over Time. Retrieved from https://animorepository.dlsu.edu.ph/etdm_softtech/16
Upload Full Text
wf_yes
Preliminary Pages
2024_Resurreccion_Chapter1.pdf (73 kB)
Chapter 1
2024_Resurreccion_Chapter2.pdf (4080 kB)
Chapter 2
2024_Resurreccion_Chapter3.pdf (2474 kB)
Chapter 3
2024_Resurreccion_Chapter4.pdf (1580 kB)
Chapter 4
2024_Resurreccion_Chapter5.pdf (102 kB)
Chapter 5
2024_Resurreccion_AppendixA.pdf (2594 kB)
Appendix A
2024_Resurreccion_AppendixB.pdf (76 kB)
Appendix B
2024_Resurreccion_References.pdf (103 kB)
References
Embargo Period
12-14-2024