Published January 1, 2020 | Version v1
Journal article | Open Access

Affective synthesis and animation of arm gestures from speech prosody

  • 1. Koç University, College of Engineering, Multimedia, Vision & Graphics Laboratory, Istanbul 34450, Turkey

Description

In human-to-human communication, speech signals carry rich emotional cues that are further emphasized by affect-expressive gestures. Automatic synthesis and animation of gestures accompanying affective verbal communication can therefore help to create more naturalistic virtual agents in human-computer interaction systems. Speech-driven gesture synthesis can map emotional cues of the speech signal to affect-expressive gestures by modeling the complex variability and timing relationships between speech and gesture. In this paper, we investigate the use of continuous affect attributes — activation, valence, and dominance — for speech-driven affective synthesis and animation of arm gestures. To this end, we present a statistical framework based on hidden semi-Markov models (HSMMs), in which states are gestures and observations are speech prosody and continuous affect attributes. The proposed framework is evaluated over four distinct HSMM structures that differ in their emission distributions. Evaluations are performed on the USC CreativeIT database in a speaker-independent setup. Among the four statistical structures, the conditional structure, which models observation distributions as prosody given affect, achieves the best performance under both objective and subjective evaluations.
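To make the conditional HSMM structure concrete, the sketch below illustrates one plausible reading of an emission model in which, for each gesture state, the joint observation likelihood factors as p(prosody, affect | gesture) = p(prosody | affect, gesture) · p(affect | gesture). All dimensions, parameter values, and the linear prosody-given-affect form are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_GESTURES = 3   # hidden states: gesture classes (illustrative count)
D_AFFECT = 3     # activation, valence, dominance
D_PROSODY = 2    # e.g. pitch (f0) and intensity (illustrative choice)

# Per-gesture affect model: diagonal Gaussian over the affect attributes.
affect_mean = rng.normal(size=(N_GESTURES, D_AFFECT))
affect_var = np.ones((N_GESTURES, D_AFFECT))

# Conditional prosody model: the prosody mean for each gesture is a
# linear function of the observed affect vector (hypothetical form).
W = rng.normal(scale=0.5, size=(N_GESTURES, D_PROSODY, D_AFFECT))
b = rng.normal(size=(N_GESTURES, D_PROSODY))
prosody_var = np.ones((N_GESTURES, D_PROSODY))

def log_gauss(x, mean, var):
    """Diagonal-covariance Gaussian log-density."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def emission_loglik(gesture, prosody, affect):
    """log p(prosody, affect | gesture) under the conditional factorization."""
    ll_affect = log_gauss(affect, affect_mean[gesture], affect_var[gesture])
    cond_mean = W[gesture] @ affect + b[gesture]
    ll_prosody = log_gauss(prosody, cond_mean, prosody_var[gesture])
    return ll_affect + ll_prosody

# Score one observation frame against every gesture state; in a full HSMM
# these scores would enter a forward pass with explicit duration modeling.
affect_obs = np.array([0.2, -0.1, 0.5])
prosody_obs = np.array([1.0, 0.3])
scores = [emission_loglik(g, prosody_obs, affect_obs) for g in range(N_GESTURES)]
best = int(np.argmax(scores))
```

In a complete system, these per-frame emission scores would be combined with state-duration distributions (the "semi-Markov" part) to decode a gesture sequence from the speech stream.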
