Call for Participation

2nd International Workshop on Musical Metacreation (MUME 2013)

Held at the Ninth AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE’13)

Northeastern University, Boston, Massachusetts, October 14-15, 2013

Introduction

We are delighted to announce the 2nd International Workshop on Musical Metacreation (MUME2013), to be held October 14 and 15, 2013, in conjunction with the Ninth AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE’13). MUME2013 builds on the enthusiastic response and participation we received for the inaugural workshop in 2012; this year the workshop has expanded to two days.

Thanks to continued progress in artistic and scientific research, a new possibility has emerged in our musical relationship with technology: Generative Music, or Musical Metacreation, the design and use of computer music systems that are “creative on their own”. Metacreation draws on tools and techniques from artificial intelligence, artificial life, and machine learning, themselves often inspired by the cognitive and life sciences. Musical Metacreation opens exciting new avenues for creative music making: discovery and exploration of novel musical styles and content, collaboration between human performers and creative software “partners”, and design of systems in gaming and entertainment that dynamically generate or modify music.

MUME brings together artists, practitioners and researchers interested in developing systems that autonomously (or interactively) recognize, learn, represent, compose, complete, accompany, or interpret music. As such, we welcome contributions to the theory or practice of generative music systems and their applications in new media, digital art, and entertainment at large. Join us at MUME2013 and take part in the exciting spirit of this growing community!

Our Motivation for Initiating MUME 2013

We have observed strong and sustained growth in the field of generative music and, more generally, Musical Metacreation. Until now, this work has been presented across a range of venues in related fields, including the International Computer Music Conference (ICMC), the International Conference on Computational Creativity (ICCC), Sound and Music Computing (SMC), EvoMusArt, Generative Art, the symposium of the International Society for Music Information Retrieval (ISMIR), and other AI, entertainment-computing, and computer-music conferences. We felt it was time to gather experts and specialists in a more focused arena to define, explore, and push forward the boundaries of Musical Metacreation. Our inaugural workshop last year was met with enthusiasm, receiving 31 submissions with a 55% acceptance rate. With MUME2013 we continue the vision of MUME as an ongoing series.

Format

The workshop will be a two-day event including:
  • Presentations of FULL TECHNICAL PAPERS (8 pages maximum)
  • Presentations of POSITION PAPERS and TECHNICAL IN-PROGRESS WORK (5 pages maximum)
  • Presentations of DEMONSTRATIONS
  • One or more PANEL SESSIONS (potential topics include international collaborations, evaluation methodologies, industry engagement, and generative music in art vs. games)
  • Presentations by INDUSTRY PARTNERS on Musical Metacreation-related work and challenges

Presentations (technical, position, and demo) will be 20 minutes long, with time for questions and answers. Reviewers will also be asked to propose one or two critical but general questions related to the submission that can be put to the authors after their presentation to stimulate discussion.

Topics

We encourage paper and demo submissions on topics including the following:
  • Novel representations of musical information
  • Systems for autonomous or interactive music composition
  • Systems for automatic generation of expressive musical interpretation
  • Systems for learning or modelling music style and structure; exploring or transforming a musical space
  • Systems for intelligently remixing or recombining musical material
  • Advances or applications of artificial intelligence, machine learning, and statistical techniques for musical purposes
  • Advances or applications of evolutionary computing or agent and multiagent-based systems for musical purposes
  • Computational models of human musical creativity
  • Techniques and systems for supporting human musical creativity; intelligent agents that support the user in being more creative musically
  • Online musical systems (i.e. systems with a real-time element)
  • Adaptive music in video games
  • Methodologies for, and studies reporting on, the evaluation of musical metacreations
  • Emerging musical styles and approaches to music production and performance involving the use of AI systems
  • Applications of Musical Metacreation for digital entertainment: sound design, soundtracks, video games, interactive art, etc.

Attendance

Academics and artists interested in presenting at the workshop are asked to submit complete papers for review. Demo presentations will be evaluated on the basis of a shorter paper (3 pages maximum) describing the system to be demonstrated. Submissions will be reviewed by an international program committee of experts.

Academics and artists interested in participating in a panel discussion are invited to contact the Workshop Chair (listed below).

We also welcome those who would like to attend the workshop without presenting. Workshop registration will be available through the AIIDE’13 conference system.

Submission Requirements

Please make submissions via the EasyChair system at:
https://www.easychair.org/conferences/?conf=mume2013.

All papers should be submitted as completed works. Demo systems should be tested and working by the time of submission, rather than speculative. We encourage audio and video material to accompany and illustrate the papers (especially for demos). We ask that authors arrange their own web hosting of audio and video files and include URLs to all such files within the text of the submitted paper.

Paper length requirements are flexible and are given as maximum limits only. For demo submissions especially, authors may prefer shorter, high-quality submissions that clearly explain their systems or findings.

Length requirements:
  • Max 8 pages for technical papers
  • Max 5 pages for position papers
  • Max 3 pages for demo papers

Formatting:

Papers should be prepared using Word or LaTeX and submitted as PDF files. Papers should be prepared in the same AAAI format as will eventually be required for the camera-ready copy. The AAAI formatting instructions, as well as Word template and LaTeX macro files, can be found here: www.aaai.org/Publications/Author/author.php

Submissions do not have to be anonymized.

AAAI will compile the accepted workshop papers into an AAAI technical report, an informal publication that allows materials to be made quickly available to a wider audience after the workshop. There will be an opportunity to revise accepted papers, based on reviewer comments, before the final document is submitted to AAAI.

Presentation and Multimedia Equipment:

We plan to provide two video projection systems as well as a stereo audio system for use by presenters at the venue. Any additional equipment required for presentations and demonstrations should be supplied by the presenters. Please contact the Workshop Chair to discuss any special equipment and setup needs or concerns.

Industry Involvement

We invite companies and businesses involved in Musical Metacreation and its applications to present their work and challenges to the MUME community. Each selected industry partner will be given a timeslot to present or demo during the workshop. Interested industry representatives can find more information at: www.metacreation.net/mume2013/industry

Questions & Requests

Please direct any inquiries/suggestions/special requests to the Workshop Chair,
Philippe Pasquier.

Submit

Submit Papers (via EasyChair) at:
https://www.easychair.org/conferences/?conf=mume2013

Important Dates

  • Submission deadline: July 1, 2013
  • Notification date: August 6, 2013
  • Camera-ready copy (CRC) of accepted papers due to AAAI Press: August 14, 2013
  • Workshop date: October 14-15, 2013

Workshop Organisers

Workshop Chair

Dr. Philippe Pasquier
Assistant Professor, School of Interactive Arts + Technology (SIAT),
Simon Fraser University

Workshop Committee

Dr. Arne Eigenfeldt
Associate Professor, The School for the Contemporary Arts,
Simon Fraser University

Dr. Oliver Bown
Lecturer in the Design Lab, Faculty of Architecture, Design and Planning,
The University of Sydney

Administration & Publicity Assistant

Graeme McCaig
PhD Student, School of Interactive Arts + Technology (SIAT),
Simon Fraser University

Program Committee

Gerard Assayag – IRCAM – France
Al Biles – Rochester Institute of Technology – USA
Tim Blackwell – Department of Computing, Goldsmiths College, University of London – UK
Alan Blackwell – Cambridge University – UK
Oliver Bown – The University of Sydney – Australia
Andrew Brown – Queensland Conservatorium, Griffith University – Australia
Jamie Bullock – Integra Lab, Birmingham Conservatoire – UK
Karen Collins – University of Waterloo – Canada
Nick Collins – University of Sussex – UK
Darrell Conklin – University of the Basque Country – Spain
Arne Eigenfeldt – Simon Fraser University – Canada
Jason Freeman – Georgia Institute of Technology – USA
Guy Garnett – University of Illinois – USA
Toby Gifford – Griffith University – Australia
Luke Harrald – Elder Conservatorium of Music, The University of Adelaide – Australia
Bill Hsu – Department of Computer Science, San Francisco State University – USA
Robert Keller – Harvey Mudd College – USA
Nyssim Lefford – Audio Technology, Luleå University of Technology – Sweden
George Lewis – Department of Music, Columbia University – USA
Aengus Martin – Faculty of Engineering, The University of Sydney – Australia
James Maxwell – Simon Fraser University – Canada
Graeme McCaig – School of Interactive Arts and Technology, Simon Fraser University – Canada
Jon McCormack – Centre for Electronic Media Art, Monash University – Australia
James McDermott – Complex and Adaptive Systems Laboratory, University College Dublin – Ireland
Alex McLean – ICSRiM, University of Leeds – UK
Kia Ng – ICSRiM, University of Leeds – UK
Philippe Pasquier – School of Interactive Arts and Technology, Simon Fraser University – Canada
Marcus Pearce – Queen Mary, University of London – UK
Robert Rowe – New York University – USA
Benjamin Smith – Case Western Reserve University – USA
Richard Stevens – Leeds Metropolitan University – UK
Michael Sweet – Berklee College of Music – USA
Peter Todd – Indiana University – USA
Dan Ventura – Brigham Young University – USA
Ivan Zavada – Conservatorium of Music, The University of Sydney – Australia