Designing a Usable Framework for Diverse Users in Synthetic Human Action Data Generation
This paper introduces SynthDa2, a synthetic data generation framework aimed at addressing data scarcity in the training of video-based machine learning models. Grounded in an initial user study (n=84), SynthDa2 provides both an API and a UI to accommodate technical and non-technical users. The framework leverages generative AI techniques and domain randomization to create diverse, user-customized synthetic video datasets. Case study experiments on dataset permutations demonstrate the feasibility of using SynthDa2 to assess the impact of dataset composition. While the initial user study identified camera angle as a popular key variation factor, the experiments reveal that varying camera angle alone is insufficient to achieve the desired outcomes, highlighting the need for further research. A subsequent hands-on user study (n=8) further validates SynthDa2's usability across varied users.
History
Journal/Conference/Book title
SA '24: SIGGRAPH Asia 2024 Technical Communications, Tokyo, Japan, December 3-6, 2024

Publication date
2024-12-03

Version
- Published