Meldy: a music generator

Matteo Bernardini, Yilin Zhu
{10743181,10702368}@mail.polimi.it

ACTAM Project 2019-2020

Overview of the Project

Project Structure

Background

  • Computer music
  • Melody generator
  • Computational creativity

Resources

  • music21: computer-aided musicology toolkit
  • p5.js: web visualization
  • OSMD: MusicXML rendering
  • webpack: bundling and development tooling

1) User Input

Choose a mood as input

  • 2D picker: select a mood and its intensity (see the sketch below)
    • implemented using p5.js

  • Dimensional approach from MER (Music Emotion Recognition)
    • x-axis: valence, in range [0, 1]
    • y-axis: arousal, in range [0, 1]
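
A minimal sketch of how such a picker could be wired up with p5.js in instance mode. The canvas size and the onMoodPicked callback are illustrative assumptions, not the project's actual code; only the x → valence, y → arousal mapping comes from the slides.

```typescript
// Hypothetical 2D valence/arousal picker built with p5.js (instance mode).
import p5 from "p5";

type Mood = { valence: number; arousal: number }; // both in [0, 1]

function createMoodPicker(onMoodPicked: (mood: Mood) => void): p5 {
  return new p5((p: p5) => {
    p.setup = () => {
      p.createCanvas(300, 300);
    };

    p.draw = () => {
      p.background(240);
      // Crosshair at the current cursor position.
      p.line(p.mouseX, 0, p.mouseX, p.height);
      p.line(0, p.mouseY, p.width, p.mouseY);
    };

    p.mousePressed = () => {
      // x → valence, y → arousal, both normalized to [0, 1].
      const valence = p.constrain(p.mouseX / p.width, 0, 1);
      const arousal = p.constrain(1 - p.mouseY / p.height, 0, 1); // top of the canvas = high arousal
      onMoodPicked({ valence, arousal });
    };
  });
}

// Usage: log the picked mood (in the real app this would trigger melody generation).
createMoodPicker((mood) => console.log("picked mood:", mood));
```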

2) Melody Generation

2.1 Methods used in this part

  • Grammar-based method
    • relative scale degree
    • note duration
  • Mapping from the selected mood to musical features (see the sketch below)
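
The sketch below only illustrates the general idea, under stated assumptions: a hand-written mapping from (valence, arousal) to a few musical features, plus a toy rewrite grammar over scale degrees. The thresholds, rules, and feature names are invented for illustration and are not Meldy's actual model.

```typescript
// Illustrative only: mood → musical features, and a toy grammar over scale degrees.
type Mood = { valence: number; arousal: number };          // both in [0, 1]
type Features = { mode: "major" | "minor"; tempo: number; durations: number[] };

function moodToFeatures({ valence, arousal }: Mood): Features {
  return {
    mode: valence >= 0.5 ? "major" : "minor",              // positive valence → major
    tempo: Math.round(60 + arousal * 80),                  // higher arousal → faster (60–140 BPM)
    durations: arousal >= 0.5 ? [0.5, 0.25] : [1, 0.5],    // durations in quarter-note units
  };
}

// Toy grammar: each non-terminal rewrites to scale degrees or other symbols.
const rules: Record<string, string[][]> = {
  PHRASE: [["OPEN", "CLOSE"]],
  OPEN: [["1", "3", "5"], ["1", "2", "3"]],
  CLOSE: [["4", "2", "1"], ["5", "7", "1"]],
};

function expand(symbol: string): string[] {
  const options = rules[symbol];
  if (!options) return [symbol];                           // terminal: a scale degree
  const choice = options[Math.floor(Math.random() * options.length)];
  return choice.flatMap(expand);
}

// Usage: derive features and a degree sequence for a happy, calm mood.
console.log(moodToFeatures({ valence: 0.8, arousal: 0.3 }), expand("PHRASE"));
```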

2) Melody Generation

2.2 Output format

  • MusicXML: designed for symbolic music representation, better suited here than MIDI
  • Natively supported by music21

2) Melody Generation

2.3 Musicology Tools: music21

  • First attempt: music21j
    • not mature enough, several bugs

  • Conclusion: music21 (Python version)
    • a back-end is needed for this step (see the client-side sketch below)
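
Since the generation runs in a Python back-end, the browser only needs to request the result. A client-side sketch, assuming a hypothetical /generate endpoint that accepts valence and arousal and returns a MusicXML document:

```typescript
// Hypothetical call to the Python back-end that runs music21.
// The "/generate" path and its query parameters are assumptions for illustration.
type Mood = { valence: number; arousal: number };

async function requestMelody(mood: Mood): Promise<string> {
  const params = new URLSearchParams({
    valence: mood.valence.toString(),
    arousal: mood.arousal.toString(),
  });
  const response = await fetch(`/generate?${params}`);
  if (!response.ok) {
    throw new Error(`generation failed: ${response.status}`);
  }
  return response.text(); // the MusicXML document as a string
}

// Usage: request a melody for a calm, positive mood.
requestMelody({ valence: 0.7, arousal: 0.2 }).then((xml) => console.log(xml.slice(0, 100)));
```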

2) Melody Generation

2.4 Development Pipeline

3) Music Rendering

3.1 Show Sheet Music

  • User clicks the "Impress Me" button → the view switches
  • Rendering MusicXML to music notation (see the sketch below)
    • OpenSheetMusicDisplay (OSMD): a TypeScript library
    • output as SVG, rendered natively by the browser
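
A minimal rendering sketch using OSMD's public API (constructor, load, render); the container id "sheet-container" is an assumption:

```typescript
import { OpenSheetMusicDisplay } from "opensheetmusicdisplay";

async function showSheetMusic(musicXml: string): Promise<void> {
  const osmd = new OpenSheetMusicDisplay("sheet-container", {
    autoResize: true,   // re-render when the container is resized
    backend: "svg",     // draw the score as SVG
  });
  await osmd.load(musicXml); // accepts a MusicXML string (or a URL)
  osmd.render();             // draws the engraved score into the container
}
```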

3) Music Rendering

3.2 Play and listen to it

  • Play → listen to the generated melody
  • Download → save the MusicXML file, open it with Finale, Sibelius, ... (see the sketch below)
  • Restart → go back to the mood selection view
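
The Download button can be implemented with standard browser APIs, assuming the generated MusicXML is kept in memory as a string; the file name below is illustrative:

```typescript
// Save the in-memory MusicXML string as a file from the browser.
function downloadMusicXml(musicXml: string, fileName = "meldy.musicxml"): void {
  const blob = new Blob([musicXml], { type: "application/vnd.recordare.musicxml+xml" });
  const url = URL.createObjectURL(blob);

  const link = document.createElement("a");
  link.href = url;
  link.download = fileName;      // triggers a "save file" dialog instead of navigation
  document.body.appendChild(link);
  link.click();
  link.remove();

  URL.revokeObjectURL(url);      // free the temporary object URL
}
```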

Future Work

How can we improve this project?

  • Improve the accuracy of the grammar model
  • From a single melodic section to multiple sections
  • Introduce different musical forms (e.g. sonata form)
  • Other musical elements (e.g. different time signatures)

References & Links

Thanks for your attention!