Creating and annotating affect databases from face and body display: A contemporary survey
- Publication Type:
- Conference Proceeding
- Citation:
- Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics, 2006, vol. 3, pp. 2426–2433
- Issue Date:
- 2006-01-01
This item is open access.
Databases containing representative samples of human multi-modal expressive behavior are needed for the development of affect recognition systems. At present, however, publicly available databases exist mainly for single expressive modalities such as facial expressions, static and dynamic hand postures, and dynamic hand gestures. Only recently has a first bimodal affect database, consisting of expressive face and upper-body display, been released. To foster the development of affect recognition systems, this paper presents a comprehensive survey of the current state of the art in affect database creation from face and body display and elicits the requirements of an ideal multi-modal affect database. © 2006 IEEE.