Week 5: Generating Music Using Deep Learning

b21626972
BBM406 Spring 2021 Projects
2 min read · May 16, 2021


Hi, in the previous week we discussed the Transformer-based model for music generation. This week, we will examine an implementation of the architecture.

Executing the Code

To visualize the results of the Music Transformer model, we executed code available on GitHub that accompanies the paper we discussed previously. To train the model, we used piano MIDI files. The dataset is split into training (90%) and validation (10%) sets; for this experiment, we used a small dataset of 100 MIDI files. We ran the code on Google Colab after setting the dataset, batch size, and number of epochs. In this experiment, 5 epochs took 80 minutes, after which the model was ready for testing. Below, we include our plot of the training loss.
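As a rough illustration of this setup, the sketch below shows how a 90/10 split of the MIDI files and the basic training settings might be prepared. The directory path, file pattern, and batch size are placeholders of our own and not the values used in the actual repository; only the epoch count (5) comes from our run.

```python
import glob
import random

# Placeholder location of the piano MIDI files (hypothetical path).
MIDI_DIR = "data/piano_midi"

# Number of epochs from our experiment; the batch size here is a placeholder.
NUM_EPOCHS = 5
BATCH_SIZE = 2

# Collect the MIDI files and shuffle them before splitting.
midi_files = sorted(glob.glob(f"{MIDI_DIR}/*.mid"))
random.seed(42)  # fixed seed so the split is reproducible
random.shuffle(midi_files)

# 90% training / 10% validation split, as described above.
split_index = int(len(midi_files) * 0.9)
train_files = midi_files[:split_index]
val_files = midi_files[split_index:]

print(f"{len(train_files)} training files, {len(val_files)} validation files")
```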

Afterwards, we evaluated the model on the validation dataset. The plot below shows our validation loss.
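For completeness, here is a small sketch of how the two loss curves could be plotted with matplotlib. The loss values below are dummy placeholders for illustration only; the real curves come from the values logged during the 5-epoch training run described above.

```python
import matplotlib.pyplot as plt

# Dummy per-epoch losses for illustration; in practice these would be
# the values logged during training and validation.
train_loss = [3.1, 2.4, 2.0, 1.8, 1.7]
val_loss = [3.0, 2.5, 2.2, 2.1, 2.0]

epochs = range(1, len(train_loss) + 1)

# Plot both curves on the same axes for comparison.
plt.plot(epochs, train_loss, label="Training loss")
plt.plot(epochs, val_loss, label="Validation loss")
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.legend()
plt.show()
```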

We believe this model and these results will guide us in our project.
