Emotion-aware human attention prediction
College
College of Computer Studies
Department/Unit
Computer Technology
Document Type
Conference Proceeding
Source Title
Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Volume
2019-June
First Page
4021
Last Page
4030
Publication Date
6-1-2019
Abstract
© 2019 IEEE. Despite the recent success in face recognition and object classification, in the field of human gaze prediction, computer models are still struggling to accurately mimic human attention. One main reason is that visual attention is a complex human behavior influenced by multiple factors, ranging from low-level features (e.g., color, contrast) to high-level human perception (e.g., object interactions, object sentiment), making it difficult to model computationally. In this work, we investigate the relation between object sentiment and human attention. We first introduce a new evaluation metric (AttI) for measuring human attention that focuses on human fixation consensus. A series of empirical data analyses with AttI indicate that emotion-evoking objects receive attention favor, especially when they co-occur with emotionally neutral objects, and this favor varies with image complexity. Based on the empirical analyses, we design a deep neural network for human attention prediction which allows the attention bias on emotion-evoking objects to be encoded in its feature space. Experiments on two benchmark datasets demonstrate its superior performance, especially on metrics that evaluate the relative importance of salient regions. This research provides the clearest picture to date on how object sentiments influence human attention, and it makes one of the first attempts to model this phenomenon computationally.
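The abstract references AttI, a fixation-consensus metric, without stating its formula; the definition appears in the paper itself. As a minimal, hypothetical sketch of the general idea (assuming Python with NumPy; the function name and the area-normalization scheme below are illustrative assumptions, not the authors' actual AttI formulation), an object-level attention score can be computed as the share of fixation density falling inside an object's mask, normalized by the object's relative area:

```python
import numpy as np

def object_attention_score(fixation_map, object_mask):
    """Hypothetical object-level attention score (illustration only,
    not the paper's AttI): the fraction of total fixation density
    inside the object's region, divided by the object's relative
    area. A value > 1 suggests the object draws more attention than
    its size alone would predict."""
    fixation_map = fixation_map.astype(np.float64)
    mask = object_mask.astype(bool)
    total = fixation_map.sum()
    if total == 0:
        return 0.0
    in_object = fixation_map[mask].sum()
    area_ratio = mask.mean()  # fraction of image covered by the object
    return (in_object / total) / max(area_ratio, 1e-12)

# Usage sketch: a synthetic fixation density map and a rectangular
# object mask covering ~4% of a 100x100 image.
fix_map = np.random.rand(100, 100)
obj_mask = np.zeros((100, 100), dtype=bool)
obj_mask[40:60, 40:60] = True
print(object_attention_score(fix_map, obj_mask))
```

Under such a scheme, comparing scores between emotion-evoking and emotionally neutral objects in the same image would quantify the attention favor the abstract describes; the paper's actual analyses use its own AttI definition.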
Digital Object Identifier (DOI)
10.1109/CVPR.2019.00415
Recommended Citation
Cordel, M., Fan, S., Shen, Z., & Kankanhalli, M. (2019). Emotion-aware human attention prediction. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2019-June, 4021-4030. https://doi.org/10.1109/CVPR.2019.00415