Singapore Institute of Technology

File(s) not publicly available

Reason: File will be uploaded after conference proceedings have been published.

Dynamic Automatic Chiptune Generation for Game Music

conference contribution
posted on 2024-10-04, 22:13 authored by Valerie En Qi Leo, Seri Hanzalah Haniffah, Xuanting Teo, Bryon Yue Xiao Tian, Nicholas Heng Loong Wong

This project investigates how Artificial Intelligence can be used to generate chiptune music that changes in real time based on emotional cues. Utilising Long Short-Term Memory (LSTM) networks, the research aims to create emotionally impactful and musically unified chiptune music that increases players' involvement in video games. The YM2413-MDB dataset, which includes FM video game music from the 1980s with detailed emotional annotations, serves as the model's training data. Essential steps consist of preprocessing the Musical Instrument Digital Interface (MIDI) files into tokenized events, training the LSTM to recognise patterns and emotions, and creating a user-friendly interface for user input. The proposed solution includes generating music sequences that mirror chosen emotions, converting the sequences into sound, and evaluating the model's success through automated and human evaluations. This study focuses on addressing the gap in real-time, emotion-based music creation, with the goal of expanding interactive entertainment and enhancing game experiences through chiptune soundtracks that adapt to the gameplay.
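To illustrate the kind of pipeline the abstract describes, the sketch below shows an emotion-conditioned LSTM over tokenized MIDI events, written in Python with PyTorch. It is not the authors' implementation: the vocabulary size, number of emotion classes, layer sizes, and the conditioning scheme (concatenating an emotion embedding to each token embedding) are all illustrative assumptions, since the abstract only names LSTM networks, tokenized MIDI events, and emotional annotations.

# Minimal sketch (assumed architecture, not the published model) of an
# emotion-conditioned LSTM generator over tokenized MIDI events.
import torch
import torch.nn as nn

class EmotionChiptuneLSTM(nn.Module):
    def __init__(self, vocab_size=512, n_emotions=4, embed_dim=256,
                 emotion_dim=32, hidden_dim=512, n_layers=2):
        super().__init__()
        self.token_embed = nn.Embedding(vocab_size, embed_dim)      # event-token embedding
        self.emotion_embed = nn.Embedding(n_emotions, emotion_dim)  # emotion-label embedding
        self.lstm = nn.LSTM(embed_dim + emotion_dim, hidden_dim,
                            num_layers=n_layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)                # next-event logits

    def forward(self, tokens, emotion, state=None):
        # tokens: (batch, seq) integer event IDs; emotion: (batch,) integer labels
        x = self.token_embed(tokens)
        e = self.emotion_embed(emotion).unsqueeze(1).expand(-1, tokens.size(1), -1)
        out, state = self.lstm(torch.cat([x, e], dim=-1), state)
        return self.head(out), state

@torch.no_grad()
def generate(model, emotion_id, start_token=0, length=256, temperature=1.0):
    """Autoregressively sample an event sequence conditioned on one emotion."""
    model.eval()
    tokens = torch.tensor([[start_token]])
    emotion = torch.tensor([emotion_id])
    state, events = None, []
    for _ in range(length):
        logits, state = model(tokens, emotion, state)
        probs = torch.softmax(logits[:, -1] / temperature, dim=-1)
        tokens = torch.multinomial(probs, 1)  # feed the sampled event back in
        events.append(tokens.item())
    return events  # to be decoded back to MIDI and synthesised as chiptune audio

In such a setup, the decoded MIDI would then be rendered through an FM or chip-style synthesiser, and generation could be re-triggered with a different emotion label as gameplay cues change.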

History

Journal/Conference/Book title

4th International Conference on Robotics, Automation, and Artificial Intelligence (RAAI) 2024

Publication date

2024-12-19

Version

  • Pre-print

Corresponding author

nicholas.wong@singaporetech.edu.sg