4th International Workshop on Musical Metacreation (MUME 2016)
Held at the Seventh International Conference on Computational Creativity, ICCC 2016
Important Dates:
- Submission deadline: 1st of May (EXTENDED: 10th of May)
- Reviewers deadline: 1st of June
- Notification: 5th of June
- Camera-ready deadline: 15th of June (EXTENDED: 18th of June)
- Workshop Day: 27th of June
We are delighted to announce the 4th International Workshop on Musical Metacreation (MUME 2016), to be held June 27, 2016, in conjunction with the Seventh International Conference on Computational Creativity, ICCC 2016. MUME 2016 builds on the enthusiastic response and participation we received at the past editions of the MUME series:
- MUME 2012 (held in conjunction with AIIDE 2012 at Stanford): https://musicalmetacreation.org/index.php/mume-2012/
- MUME 2013 (held in conjunction with AIIDE 2013 at Northeastern): https://musicalmetacreation.org/index.php/mume-2013/
- MUME 2014 (held in conjunction with AIIDE 2014 in North Carolina): https://musicalmetacreation.org/index.php/mume-2014/
Metacreation involves using tools and techniques from artificial intelligence, artificial life, and machine learning, themselves often inspired by cognitive and life sciences, for creative tasks. Musical Metacreation explores the design and use of these tools for music making: discovery and exploration of novel musical styles and content, collaboration between human performers and creative software “partners”, and design of systems in gaming and entertainment that dynamically generate or modify music.
MUME aims to bring together artists, practitioners, and researchers interested in developing systems that autonomously (or interactively) recognize, learn, represent, compose, generate, complete, accompany, or interpret music. As such, we welcome contributions to the theory or practice of generative music systems and their applications in new media, digital art, and entertainment at large.
Our Motivation for Initiating the MUME Series
We have observed strong and sustained growth in the field of generative music and, more generally, Musical Metacreation. Until this point, the work has been presented across a range of venues in related fields, including the International Computer Music Conference (ICMC), the International Conference on Computational Creativity (ICCC), Sound and Music Computing (SMC), AudioMostly, EvoMusArt, Generative Art, the conference of the International Society for Music Information Retrieval (ISMIR), and other AI, entertainment-computing, and computer-music conferences. We felt it was time to gather experts and specialists in a more focused arena to define, explore, and push forward the boundaries of Musical Metacreation. With MUME 2016 we continue the vision of MUME as an ongoing series.
We encourage paper and demo submissions on MUME-related topics, including the following:
- Models, Representation and Algorithms for MUME:
  - Novel representations of musical information
  - Advances or applications of AI, machine learning, and statistical techniques for generative music
  - Advances in A-Life, evolutionary computing, or agent- and multi-agent-based systems for generative music
  - Computational models of human musical creativity
- Systems and Applications of MUME:
  - Systems for autonomous or interactive music composition
  - Systems for automatic generation of expressive musical interpretation
  - Systems for learning or modeling musical style and structure
  - Systems for intelligently remixing or recombining musical material
  - Online musical systems (i.e., systems with a real-time element)
  - Adaptive and generative music in video games
  - Techniques and systems for supporting human musical creativity
  - Emerging musical styles and approaches to music production and performance involving the use of AI systems
  - Applications of musical metacreation for digital entertainment: sound design, soundtracks, interactive art, etc.
- Evaluation of MUME:
  - Methodologies for qualitative or quantitative evaluation of MUME systems
  - Studies reporting on the evaluation of MUME systems
  - Socio-economic impact of MUME
  - Philosophical implications of MUME
  - Authorship and legal implications of MUME
Submission Format and Requirements
Please make submissions via the EasyChair system at:
The workshop is a full-day event that includes:
- Presentations of FULL TECHNICAL PAPERS (8 pages maximum)
- Presentations of POSITION PAPERS and WORK-IN-PROGRESS PAPERS (5 pages maximum)
- Presentations of DEMONSTRATIONS (3 pages maximum)
All papers should be submitted as complete works. Demo systems should be tested and working by the time of submission, rather than speculative. We encourage audio and video material to accompany and illustrate the papers (especially for demos). We ask that authors arrange for web hosting of their audio and video files and include URL links to these files in the text of the submitted paper.
Submissions do not have to be anonymized, as we use single-blind reviewing. Each submission will be reviewed by at least three program committee members.
Workshop papers will be published as the MUME 2016 Proceedings and will be archived with an ISBN. Submissions should be formatted using the AAAI 2-column format; see instructions and templates here:
A MUME 2016 template, based on the AAAI template, is available in both LaTeX and Word formats at:
Submissions should be uploaded via the MUME 2016 EasyChair portal:
Presentation and Multimedia Equipment:
We will provide a video projection system as well as a stereo audio system for use by presenters at the venue. Additional equipment required for presentations and demonstrations should be supplied by the presenters. Contact the Workshop Chair to discuss any special equipment and setup needs/concerns.
It is expected that at least one author of each accepted submission will attend the workshop to present their contribution. We also welcome those who would like to attend the workshop without presenting. Workshop registration will be available through the ICCC 2016 conference system.
Questions & Requests
Please direct any inquiries/suggestions/special requests to the Workshop Chair, Philippe Pasquier.
Workshop Organizers
Dr. Philippe Pasquier
Associate Professor, School of Interactive Arts + Technology (SIAT),
Simon Fraser University
Email Dr. Pasquier, or visit his website.
Dr. Arne Eigenfeldt
Professor, The School for the Contemporary Arts,
Simon Fraser University
Email Dr. Eigenfeldt, or visit his website.
Dr. Oliver Bown
Senior Lecturer, Art and Design,
University of New South Wales
Email Dr. Bown, or visit his website.
Administration & Publicity Assistant
Kıvanç Tatar
PhD Student, School of Interactive Arts + Technology (SIAT),
Simon Fraser University
Email Kıvanç, or visit his website.
Program Committee
- Robert Keller – Harvey Mudd College (USA)
- George Lewis – Columbia University (USA)
- Eduardo Miranda – University of Plymouth (UK)
- Diemo Schwarz – Ircam – CNRS STMS (France)
- Andrew Sorensen – QUT (Australia)
- Darrell Conklin – University of the Basque Country (Basque Country)
- Roger Dannenberg – Carnegie Mellon University (USA)
- Benjamin Smith – University of Illinois at Urbana-Champaign (USA)
- David Cope – UCSC (USA)
- Evan Merz – UCSC (USA)
- Tim Blackwell – Goldsmiths, University of London (UK)
- Jason Freeman – Georgia Institute of Technology (USA)
- Dan Ventura – Brigham Young University (USA)
- Bill Manaris – College of Charleston (USA)
- James McDermott – University College Dublin (Ireland)
- Robert Rowe – New York University (USA)
- Matthew Yee-King – Goldsmiths, University of London (UK)
- Michael Casey – Dartmouth College (USA)
- Jianyu Fan – School of Interactive Arts and Technology, Simon Fraser University (Canada)
- Miles Thorogood – School of Interactive Arts and Technology, Simon Fraser University (Canada)
- Tristan Bayfield – School of Interactive Arts and Technology, Simon Fraser University (Canada)
- Johan Lilius – Åbo Akademi University (Finland)
- Steven Jan – University of Huddersfield (UK)
- Anna Jordanous – University of Kent (UK)
- Mark Sandler – Queen Mary University of London (UK)
- Andrew Brown – Griffith University (Australia)
- Frank Dufour – The University of Texas at Dallas (USA)
- Guillaume Beslon – LIRIS (France)
- Clifton Callender – Florida State University (USA)
- Ian Whalley – University of Waikato (New Zealand)
- Doug Van Nort – York University (Canada)