GetEmotion: A data collection tool for building an affective haptic dataset

College

College of Computer Studies

Document Type

Conference Proceeding

Source Title

IEEE 48th Annual Computers, Software, and Applications Conference (COMPSAC)

First Page

2044

Last Page

2053

Publication Date

2024

Abstract

Interpersonal communication between humans is driven by different emotions expressed through various communication channels, including touch. Touch is a fundamental aspect of social interaction that has connected people of all ages to different concepts and mechanisms in the physical world, and it directly influences emotional expression and recognition through intimacy. The rise of digital communication, however, has reduced physical interaction and shifted emotion recognition to other modalities, such as face and voice, collected using cameras. These modalities, however, are not always available in everyday smartphone use: the face may not be in view, nor high-quality audio obtainable, for mobile applications to recognize emotions. This work introduces an alternative modality for emotion recognition: haptic touch. This paper presents haptic touch data collected using a mobile application called GetEmotion, which records the responses of twenty (20) participants as they view clips from the LIRIS-ACCEDE dataset. Following James Russell’s Circumplex Model of Affect, features from the collected data were pre-processed and analyzed for correlation and significance with respect to the valence and arousal values of the video clips. Results show that three features have a significant relation to the valence values, namely pressure and the start and end coordinates along the x-axis. These findings suggest that the location of touch interaction on the screen and the intensity of touch indicate emotional states. Moreover, arousal levels were found to correlate significantly with diverse features such as duration, touch count, distance, and various velocity and acceleration values. The findings indicate that both the nature and intensity of touch activities can reflect users’ emotional responses to video stimuli.
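The abstract describes deriving touch features (duration, distance, velocity, pressure, start/end coordinates) from gesture data and testing their correlation with valence and arousal. The sketch below is a minimal, hypothetical illustration of that kind of pipeline; the event schema `(time, x, y, pressure)` and the feature names are assumptions for illustration, not the paper's actual implementation or feature set.

```python
import math

def touch_features(events):
    """Derive simple per-gesture features from touch samples.

    `events` is a list of (t, x, y, pressure) tuples — a hypothetical
    schema; the paper's actual feature set is richer (touch count,
    acceleration, etc.).
    """
    t0, x0, _, _ = events[0]
    t1, x1, _, _ = events[-1]
    duration = t1 - t0
    # Path length of the gesture across consecutive samples.
    distance = sum(
        math.hypot(b[1] - a[1], b[2] - a[2])
        for a, b in zip(events, events[1:])
    )
    return {
        "duration": duration,
        "distance": distance,
        "velocity": distance / duration if duration else 0.0,
        "mean_pressure": sum(e[3] for e in events) / len(events),
        "start_x": x0,  # x-axis start/end coordinates, per the abstract
        "end_x": x1,
    }

def pearson(xs, ys):
    """Pearson correlation coefficient — the kind of feature-to-rating
    association the abstract reports for valence and arousal."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0
```

In a study like this, one would compute such features for each participant-clip pair and correlate each feature column against the clip's annotated valence or arousal value, then test for significance.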


Digital Object Identifier (DOI)

10.1109/COMPSAC61105.2024.00327

Disciplines

Computer Sciences

Keywords

Touch; Data sets


