Details
Presenter(s)
Display Name
Yufei Gao
Affiliation
Southwest University of Science and Technology
Country
Abstract
In this paper, we propose a learning framework based on a generative adversarial network (GAN) and a multi-feature fusion strategy to learn the association between music and dance, realizing a mapping from music to dance. The network integrates the style, beat, and structural characteristics of the music, and uses two discriminators to impose constraints on both style and temporal coherence. Quantitative and qualitative experimental results show that our method can generate realistic, style-consistent, and beat-matched dance movements from music. In addition, we collected and produced a dataset containing music features and corresponding pose sequences, which facilitates music-driven pose generation.
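The pipeline the abstract describes, fusing style, beat, and structural music features, generating a pose sequence, and scoring it with separate style and coherence discriminators, can be sketched as follows. This is a minimal toy illustration under assumed shapes and scoring rules; all function names and logic here are hypothetical stand-ins, not the authors' implementation.

```python
# Hypothetical sketch of the described pipeline. All names, shapes, and
# scoring rules are illustrative assumptions, not the paper's actual code.
from typing import List

def fuse_features(style: List[float], beat: List[float],
                  structure: List[float]) -> List[float]:
    """Multi-feature fusion, sketched here as simple concatenation."""
    return style + beat + structure

def generate_poses(fused: List[float], n_frames: int) -> List[List[float]]:
    """Toy 'generator': emits one pose vector per frame from the fused
    features (a stand-in for the learned music-to-dance mapping)."""
    return [[x * (t + 1) / n_frames for x in fused] for t in range(n_frames)]

def style_score(poses: List[List[float]], style: List[float]) -> float:
    """Toy style discriminator: distance between the average pose
    magnitude and the style features (lower = better style match)."""
    mean_pose = sum(sum(p) for p in poses) / len(poses)
    return abs(mean_pose - sum(style))

def coherence_score(poses: List[List[float]]) -> float:
    """Toy coherence discriminator: penalizes large frame-to-frame
    jumps in the pose sequence (lower = smoother motion)."""
    return sum(
        sum(abs(a - b) for a, b in zip(poses[t], poses[t + 1]))
        for t in range(len(poses) - 1)
    )

# Example: fuse three feature groups, generate 4 frames, combine both
# discriminator scores into a single loss-like quantity.
fused = fuse_features([0.2, 0.5], [1.0], [0.3, 0.1, 0.4])
poses = generate_poses(fused, n_frames=4)
loss = style_score(poses, [0.2, 0.5]) + coherence_score(poses)
print(len(fused), len(poses), round(loss, 3))
```

In an actual GAN setup both discriminators would be trained adversarially against the generator; here they are reduced to fixed scoring functions purely to show how the two constraints (style match and temporal coherence) combine.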