Documented by: Keyin Wu

Group Project with: Chenhe Gu, Ziying Wang

Description

This is an immersive music game that, unlike traditional rhythm games, lets the user experience alternative renditions of pop songs and hear real-time modulations to the music as a result of their actions. Specifically, notes from Deep Learning-generated melody sequences are mapped to blocks that the player hits along with the music; hitting a block in the wrong direction alters the music the player hears.

#VirtualReality #ComputerMusic #GameDesign #MusicInformationRetrieval #MusicSynthesis

Melody Saber

Background

Melody Saber is an immersive music game inspired by the popular rhythm game Beat Saber. Unlike Beat Saber, which plays the entire music piece regardless of the player's performance, Melody Saber lets the user generate a customized game map for pop songs and plays different melody clips depending on how the player hits each block. With Melody Saber, we aim to close the loop between Music Information Retrieval (MIR) and music synthesis.

Toolkits

Python (pretty_midi), Unity3D, C#, Oculus, Jupyter Notebook

Approach

We generate completely new accompaniment arrangements from the main melody of well-known pop songs and map the results to playable game objects. We divide the process into three stages, reflected in our input-processing workflow (Fig. 1):

Figure 1. System Summary
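To make the note-to-block mapping concrete, here is a minimal Python sketch of how generated melody events (pitch, onset time) could be turned into block placements, assuming a four-lane layout where lane is chosen by pitch range and cut direction by melodic contour. The names (`Block`, `notes_to_blocks`) and the mapping rules are illustrative assumptions, not the project's actual implementation; in practice the note events would come from a `pretty_midi` melody track.

```python
# Hypothetical note-to-block mapping: lane from pitch range,
# cut direction from melodic contour. Illustrative only.
from dataclasses import dataclass

@dataclass
class Block:
    time: float      # seconds into the song
    lane: int        # 0-3, left to right
    direction: str   # required cut direction: "up" or "down"

def notes_to_blocks(notes, low=48, high=84):
    """Map (pitch, start_time) melody events to game blocks.

    Lane: MIDI pitch quantized into four columns across [low, high].
    Direction: a rising melody asks for an "up" cut, falling for "down".
    """
    blocks = []
    prev_pitch = None
    span = high - low
    for pitch, start in notes:
        lane = min(3, max(0, (pitch - low) * 4 // span))
        direction = "up" if prev_pitch is None or pitch >= prev_pitch else "down"
        blocks.append(Block(time=start, lane=lane, direction=direction))
        prev_pitch = pitch
    return blocks

# Example: a short C-major fragment (C4, E4, G4, D4)
melody = [(60, 0.0), (64, 0.5), (67, 1.0), (62, 1.5)]
blocks = notes_to_blocks(melody)
```

In Unity, each `Block` would then be instantiated at `time` on a conveyor toward the player, and a mismatch between the saber swing and `direction` triggers the altered melody clip described above.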

1. Accompaniment Generation