MUME 2018 Concert

The Musical Metacreation Concert at the Ninth International Conference on Computational Creativity, Salamanca, Spain, June 2018


The MUME 2018 Concert will start at 8pm on June 25th. Please stay tuned for the location of the concert and the details of the program.

The show will have three performances:

1- New Songs by ALYSIA: Pop, Rap & CC Anthem – Maya Ackerman
ALYSIA is a co-creative songwriting system that makes it easy to create original vocal melodies. The focus of this set is on modern pop music with some rap influences, featuring two songs made in collaboration with the system: “Believe In Us” and “Beautiful Memory”. Lastly, we used ALYSIA to create a Computational Creativity Anthem. You’re invited to sing along!
CC Anthem (lyrics)
Computational Creativity
The final frontier
Co-creativity, robot creativity
What is creativity anyway?
Not just generation,
Don’t forget evaluation
Evaluation is king!
Computational Creativity
The final frontier


Maya Ackerman is an Assistant Professor at Santa Clara University, specializing in Computational Creativity and Machine Learning. She is also a semi-professional singer with extensive vocal training. Maya is a co-creator of ALYSIA, and frequently performs songs made with this system.

David Loker studied at the University of Waterloo and worked as a Data Scientist at Netflix. He is now the CTO of WaveAI Inc. David is a co-creator of ALYSIA.

Christopher Cassion is a recent graduate of the Georgia Institute of Technology, where he completed his Masters in Computer Science. Christopher has also previously worked at IBM Research. He is a co-creator of ALYSIA.

Dusti Miraglia is a producer and sound designer with over 15 years of industry experience. Dusti produces the background tracks for ALYSIA’s compositions.

2- Four pieces by Paul Bodily and Dan Ventura
The Night Sky
This is the Way
And I Think
It is Lost (Be Found)

Pop* (pronounced Pop-Star), featured in the 2017 MuMe concert, is an automated pop/rock/show tunes lead sheet composer. It uses a modular framework to learn and generate verse-chorus structure, rhyme scheme, lyrics, harmony, and melody. Pop* learns from existing pop lead sheets and song lyrics, and produces both printed sheet music and audio recordings featuring its own computer-sung lyrics.
Pop* has three interests: being in love, feeling depressed, and new beginnings. Pop* searches for tweets related to its interests. From these tweets it chooses a tweet that makes it feel something. Pop* uses this feeling to formulate an intention: a theme that the system will communicate through music.
Pop* searches for existing lyrics and sheet music that are related to its intention. Pop* uses the lyrics and sheet music to learn patterns of chords, rhythm, pitch, lyrics, and structural motifs. It uses this learning to generate multiple compositions. Pop* evaluates each composition and chooses one that best reflects the system’s intention and has the catchiest music.
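Pop*'s pipeline described above (pick an inspiring tweet, form an intention, generate many candidates, evaluate, keep the best) follows a classic generate-and-test pattern. The following is a minimal sketch of that loop only; the function names, the keyword-count scoring, and the toy chord "catchiness" measure are illustrative assumptions, not Pop*'s actual models:

```python
import random

def intention_from_tweets(tweets, interests):
    """Pick the tweet most related to the system's interests
    (placeholder scoring: count interest keywords)."""
    scored = [(sum(t.lower().count(k) for k in interests), t) for t in tweets]
    return max(scored)[1]

def generate_candidate(rng):
    """Stand-in for Pop*'s learned chord/lyric/melody generation:
    here, just a random four-chord progression."""
    return [rng.choice(["C", "F", "G", "Am"]) for _ in range(4)]

def catchiness(candidate):
    """Toy evaluation: reward repeated chords as a crude proxy."""
    return len(candidate) - len(set(candidate))

def compose(tweets, interests, n=50, seed=0):
    """Generate n candidates and keep the one scoring highest."""
    rng = random.Random(seed)
    theme = intention_from_tweets(tweets, interests)
    candidates = [generate_candidate(rng) for _ in range(n)]
    best = max(candidates, key=catchiness)
    return theme, best
```

In the real system each placeholder is a learned model (structure, rhyme, harmony, melody), but the outer select-generate-evaluate shape is the same.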
Paul Bodily is a PhD candidate in the Computer Science (CS) department at Brigham Young University (BYU) in Provo, Utah. Under the advisement of Dr. Dan Ventura, his research focuses on machine learning for inspired, structured, lyrical music composition. Paul’s other computational creativity interests include recipe generation (PIERRE), creative text transformation (Lyrist), and six-word stories (MICROS).
Dr. Dan Ventura is a CS professor at BYU whose focus is on computational creativity systems generally. Students under his advisement have published systems in domains such as artistic image generation (DARCI), recipe generation (PIERRE), jazz lead sheet composition (CARL), and neology (Nehovah).
3- Maybe Arrive – Harun Gezici and Jay Hardesty
Anticipation and arrival, or lack of arrival, nested within other anticipations and outcomes, forms a self-similar landscape of rhythmic building blocks. In this piece, bass lines and melodies are parsed into these rhythmic building blocks and then regenerated from nearby points on that landscape, steered by the performer in real time. The geometry of the GUI affords navigation and shaping of music at a relatively subjective level, via simple, nested operations on patterns of rhythmic expectation.
These algorithms are implemented by custom macOS apps that analyze and generate MIDI note patterns in Ableton Live. One app directly manipulates individual Live clips and the other app morphs between selected Live clips. Remaining elements of the Live set are also under the control of the performer but unaffected by the algorithms.
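The morphing app blends note patterns between two selected Live clips. The actual algorithm (built on nested rhythmic expectations) is not reproduced here; as a minimal sketch under strong simplifying assumptions, one could interpolate between two binary onset patterns of equal length and threshold the blend back to a pattern, where `t` plays the role of the performer-controlled morph position:

```python
def morph(clip_a, clip_b, t):
    """Blend two equal-length onset patterns (1 = note on, 0 = rest).
    Weighted average of onset strengths, thresholded back to binary.
    t = 0.0 reproduces clip_a; t = 1.0 reproduces clip_b."""
    return [1 if (1 - t) * a + t * b >= 0.5 else 0
            for a, b in zip(clip_a, clip_b)]
```

A real implementation would operate on MIDI note events (pitch, velocity, timing) rather than flat onset grids, but the idea of steering continuously between two clips is the same.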
The composer and performer of the piece is Harun Gezici. He has been producing different styles of electronic music for more than 15 years as LowNoiz, and he is the founder of the netlabel Rauscharm Recordings. He resides in Ennetbaden, Switzerland.
The author of the software is Jay Hardesty. He resides in Zurich, Switzerland. Background and recent projects, including the algorithms used for this performance, are detailed on his website.

Workshop Organizers

Dr. Oliver Bown
Interactive Media Lab
UNSW, Australia

Prof. Arne Eigenfeldt
School for the Contemporary Arts
Simon Fraser University, Canada

Prof. Philippe Pasquier
School of Interactive Arts and Technology (SIAT)
Simon Fraser University, Canada

Kıvanç Tatar
School of Interactive Arts and Technology (SIAT)
Simon Fraser University, Canada

MUME Steering Committee

Andrew Brown – Griffith University, Australia

Anna Jordanous – University of Kent, UK

Bob Keller – Harvey Mudd College, US

Róisín Loughran – University College Dublin, Ireland

Michael Casey – Dartmouth College, US

Benjamin Smith – Purdue University Indianapolis, US