IEMOCAP
IEMOCAP (the Interactive Emotional Dyadic Motion Capture database) is a widely used dataset in affective computing and speech emotion recognition research. It was created by researchers at the University of Southern California and comprises roughly 12 hours of audio-visual recordings of 10 actors, recorded in pairs, performing both scripted plays and improvised scenarios designed to elicit specific emotions. The goal of the dataset was to capture a range of emotional expressions in a dyadic (two-person) interaction context.
The recordings were made with high-quality motion capture technology, capturing detailed facial expressions, head movements, and hand gestures alongside the speech. Each utterance was subsequently annotated by human evaluators with categorical emotion labels (such as anger, happiness, sadness, and neutral) as well as dimensional ratings of valence, arousal, and dominance.
IEMOCAP has been instrumental in advancing research in areas such as automatic emotion recognition from speech, text, and visual cues, and it remains a standard benchmark for evaluating multimodal emotion recognition systems.
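To illustrate how researchers typically work with the dataset's utterance-level annotations, the sketch below parses a few label lines in the style of IEMOCAP's evaluation files and tallies the emotion class distribution. The exact file format and the sample lines here are simplified assumptions for illustration, not the dataset's authoritative layout.

```python
import re
from collections import Counter

# Hypothetical annotation lines (simplified, in the general style of
# IEMOCAP evaluation files): [start - end] utterance_id label [V, A, D]
SAMPLE = """\
[6.2901 - 8.2357] Ses01F_impro01_F000 neu [2.5, 2.5, 2.5]
[10.0100 - 11.3925] Ses01F_impro01_F001 ang [1.5, 3.5, 3.0]
[14.8872 - 18.0175] Ses01F_impro01_M000 ang [2.0, 3.0, 3.5]
"""

LINE_RE = re.compile(
    r"\[(?P<start>[\d.]+) - (?P<end>[\d.]+)\]\s+"
    r"(?P<utt>\S+)\s+(?P<label>\w+)\s+"
    r"\[(?P<vad>[^\]]+)\]"
)

def parse_labels(text):
    """Yield (utterance_id, label, (valence, arousal, dominance)) per line."""
    for line in text.splitlines():
        m = LINE_RE.match(line.strip())
        if m:
            vad = tuple(float(x) for x in m.group("vad").split(","))
            yield m.group("utt"), m.group("label"), vad

utterances = list(parse_labels(SAMPLE))
counts = Counter(label for _, label, _ in utterances)
print(counts)  # class distribution across the parsed utterances
```

Tallying the label distribution like this is a common first step, since IEMOCAP's classes are imbalanced and many papers restrict evaluation to a subset of the categorical labels.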