Singapore Institute of Technology
Designing a Usable Framework for Diverse Users in Synthetic Human Action Data Generation

Conference contribution posted on 2025-01-10, authored by Megani Rajendran, Chek Tien Tan, Indriyati Atmosukarto, Aik Beng Ng, Joey Lim, Triston Chan Sheen, Simon See

This paper introduces SynthDa2, a synthetic data generation framework aimed at addressing data scarcity in training video-based machine learning models. Grounded in an initial user study (n=84), SynthDa2 provides both an API and a UI to accommodate technical and non-technical users. The framework leverages generative AI techniques and domain randomization to create diverse, user-customized synthetic video datasets. Case study experiments on dataset permutations demonstrate the feasibility of using SynthDa2 to assess how dataset composition affects outcomes. While the initial user study identified camera angles as a popular key variation factor, the experiments reveal that varying camera angles alone is insufficient to achieve the desired outcomes, highlighting the need for further research. A subsequent hands-on user study (n=8) further validates SynthDa2's usability across varied users.

History

Journal/Conference/Book title

SA '24: SIGGRAPH Asia 2024 Technical Communications, Tokyo, Japan, December 3-6, 2024

Publication date

2024-12-03

Version

  • Published
