CFP International Conference on Computational Creativity

International Conference on Computational Creativity 2019
17-21 June 2019 – Charlotte, North Carolina, USA

Submit your paper, workshop, and demo proposals for the International Conference on Computational Creativity 2019, to be held 17-21 June 2019 in Charlotte, NC!

(Alternatively, submit to the MuMe 2019 workshop!)

Computational creativity is the art, science, philosophy and engineering of computational systems which, by taking on particular responsibilities, exhibit behaviours that unbiased observers would deem to be creative. As a field of research, this area is thriving, with progress in formalising what it means for software to be creative, along with many exciting and valuable applications of creative software in the sciences, design, the arts, literature, gaming and elsewhere. The ICCC conference series, organized by the Association for Computational Creativity since 2010, is the only scientific conference devoted exclusively to computational creativity, covering all aspects of the field.

The conference programme will include paper presentations, workshops, poster sessions, a doctoral consortium and an art/demo exhibition, along with prominent keynote speakers. All submitted papers will be reviewed by experts in the field based on the criteria of originality, significance, quality and clarity.

Important Dates:
Paper and Workshop Proposal Submissions Due: 18 February 2019
Doctoral Consortium Submissions Due: 18 March 2019
Acceptance Notification: 29 April 2019
Art/Demo Submissions Due: 17 May 2019
Camera-Ready Submission: 17 May 2019
Conference Dates: 17-21 June 2019

Complete details are available on the conference website.


CFP: 7th International Workshop on Musical Metacreation


((( MUME 2019 )))
The 7th International Workshop on Musical Metacreation
June 17-18, 2019, Charlotte, North Carolina

MUME 2019 is to be held at the University of North Carolina at Charlotte in conjunction with the 10th International Conference on Computational Creativity, ICCC 2019.

=== Important Dates ===
Workshop submission deadline: February 24, 2019
Notification date: April 28, 2019
Camera-ready version: May 19, 2019
Workshop dates: June 17-18, 2019

Metacreation applies tools and techniques from artificial intelligence, artificial life, and machine learning (themselves often inspired by cognitive and natural science) to creative tasks. Musical Metacreation studies the design and use of these generative tools and theories for music making: discovery and exploration of novel musical styles and content, collaboration between human performers and creative software “partners”, and design of systems in gaming and entertainment that dynamically generate or modify music.

MUME intends to bring together artists, practitioners, and researchers interested in developing systems that autonomously (or interactively) recognize, learn, represent, compose, generate, complete, accompany, or interpret music. As such, we welcome contributions to the theory or practice of generative music systems and their applications in new media, digital art, and entertainment at large.

We encourage paper and demo submissions on MUME-related topics, including the following:
— Models, Representation and Algorithms for MUME
—- Novel representations of musical information
—- Advances or applications of AI, machine learning, and statistical techniques for generative music
—- Advances of A-Life, evolutionary computing or agent and multi-agent based systems for generative music
—- Computational models of human musical creativity
— Systems and Applications of MUME
—- Systems for autonomous or interactive music composition
—- Systems for automatic generation of expressive musical interpretation
—- Systems for learning or modeling music style and structure
—- Systems for intelligently remixing or recombining musical material
—- Online musical systems (i.e. systems with a real-time element)
—- Adaptive and generative music in video games
—- Generative systems in sound synthesis, or automatic synthesizer design
—- Techniques and systems for supporting human musical creativity
—- Emerging musical styles and approaches to music production and performance involving the use of AI systems
—- Applications of musical metacreation for digital entertainment: sound design, soundtracks, interactive art, etc.
— Evaluation of MUME
—- Methodologies for qualitative or quantitative evaluation of MUME systems
—- Studies reporting on the evaluation of MUME
—- Socio-economic impact of MUME
—- Philosophical implications of MUME
—- Authorship and legal implications of MUME

Submission Format and Requirements
Please make submissions via the EasyChair system at:

The workshop is a day and a half event that includes:
- Presentations of FULL TECHNICAL PAPERS (8 pages maximum)
- Presentations of POSITION PAPERS and WORK-IN-PROGRESS PAPERS (5 pages maximum)
- Presentations of DEMONSTRATIONS (3 pages maximum), which present outputs of systems (working live or offline)

All papers should be submitted as complete works. Demo systems should be tested and working by the time of submission, rather than speculative. We encourage audio and video material to accompany and illustrate the papers (especially for demos). We ask that authors arrange their own web hosting of audio and video files, and give URL links to all such files within the text of the submitted paper.

Submissions do not have to be anonymized, as we use single-blind reviewing. Each submission will be reviewed by at least three program committee members.

Workshop papers will be published as the MUME 2019 Proceedings and archived with an ISBN. Please format your paper using the new MUME 2019 template, which is based on the AAAI template; feel free to edit the licence entry (at the bottom left of the first page). The MUME 2019 LaTeX and Word templates are available at:

Submissions should be uploaded via the MUME 2019 EasyChair portal:

For complete details on attendance, submissions and formatting, please visit the workshop website:

Presentation and Multimedia Equipment:
We will provide a video projection system as well as a stereo audio system for use by presenters at the venue. Additional equipment required for presentations and demonstrations should be supplied by the presenters. Contact the Workshop Chair to discuss any special equipment and setup needs/concerns.

It is expected that at least one author of each accepted submission will attend the workshop to present their contribution. We also welcome those who would like to attend the workshop without presenting. Workshop registration will be available through the ICCC 2019 conference system.

MUME 2019 builds on the enthusiastic response and participation received at past editions of the MUME series:
MUME 2012 (held in conjunction with AIIDE 2012 at Stanford):
MUME 2013 (held in conjunction with AIIDE 2013 at Northeastern):
MUME 2014 (held in conjunction with AIIDE 2014 at North Carolina State University):
MUME 2016 (held in conjunction with ICCC 2016 at Université Pierre et Marie Curie):
MUME 2017 (held in conjunction with ICCC 2017 at Georgia Institute of Technology):
MUME 2018 (held in conjunction with ICCC 2018 at the University of Salamanca):

Questions & Requests
Please direct any inquiries, suggestions, or special requests to one of the Workshop Chairs, Robert Keller or Bob Sturm.

Workshop Organizers

Program Co-Chair
Robert M. Keller, Professor
Computer Science Department
Harvey Mudd College
301 Platt Blvd
Claremont, CA 91711 USA

Program Co-Chair
Bob L. Sturm, Associate Professor
Division of Speech, Music and Hearing (Tal, Musik och Hörsel)
School of Electrical Engineering and Computer Science
KTH Royal Institute of Technology
Lindstedtsvägen 24, Stockholm, Sweden

Concert Chair
Gus Xia, Assistant Professor
Computer Science
NYU Shanghai

Publicity Chair
Dr. Oliver Bown
Senior Lecturer
Faculty of Art & Design, The University of New South Wales
Room AG12, Cnr Oxford St & Greens Rd,
Paddington, NSW, 2021, Australia



MUME Steering Committee

Andrew Brown, Griffith University, Australia
Michael Casey, Dartmouth College, US
Arne Eigenfeldt, Simon Fraser University, Canada
Anna Jordanous, University of Kent, UK
Bob Keller, Harvey Mudd College, US
Róisín Loughran, University College Dublin, Ireland
Philippe Pasquier, Simon Fraser University, Canada
Benjamin Smith, Indiana University-Purdue University Indianapolis, US

Machine Folk is Spreading!

Here’s a wonderful performance of three machine folk tunes played by The Continental Ceili Combo (Tijn Berends and Bas Nieraeth). This mini-concert took place at the IMPAKT “Robots Love Music” exhibition at the wonderful Museum Speelklok in Utrecht on Oct. 3, 2018.

The first two tunes they play are “The Hills Of Dorrectance” and “The Temples Of The Warvon”, both by folk-rnn v1 (which also titled them). The third tune they play is #3809 by folk-rnn v3.

Thanks to Luba Elliott and the Impakt Festival for making this event happen!

Horses in Umeå!

I’m happy to be speaking at the 6th Swedish Workshop on Data Science at Umeå University, Nov. 20-21, 2018.

Title: Be a responsible data scientist: Identify and tame your “horses”

Abstract: A “horse” is a system that is not actually addressing the problem it appears to be solving. The inspiration for the metaphor is the real-life example of Clever Hans, a horse that appeared to have great skill in mathematics but had actually learned to respond to a prosaic cue confounded with the correct answer. Similarly, a model created through the statistical treatment of a large dataset and wielded by a data scientist can appear to solve a complex problem while actually doing no such thing. In this talk, I take a critical look at past applications of data science — exemplifying contemporary practices — and identify where issues arise that affect the validity of conclusions. I argue that the onus is on the data scientist to not stop at describing how well a model performs on a given dataset (no matter how big it may be), but to go further and explain what they and their models are actually doing. I provide some examples of how researchers have identified and tamed “horses” in my research domain, music informatics.
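The Clever Hans failure mode described in the abstract can be made concrete with a small synthetic sketch (everything here is hypothetical and illustrative, not from the talk): a nearest-centroid “genre” classifier that succeeds on held-out data only because one genre happens to be mastered louder in the dataset. Applying an irrelevant transformation — re-mastering test clips to equal loudness, which leaves genre unchanged — collapses its accuracy and exposes the horse.

```python
import random

random.seed(0)

def make_clip(genre, confounded=True):
    """Synthetic 2-feature 'audio clip': a weakly informative content
    feature, and a loudness feature that (by accident of dataset
    curation) strongly tracks the genre label when confounded=True."""
    content = random.gauss(0.5 if genre == "A" else -0.5, 2.0)   # noisy, weak
    loudness = (random.gauss(2.0 if genre == "A" else -2.0, 0.2)
                if confounded else random.gauss(0.0, 0.2))       # the confound
    return (content, loudness)

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def predict(x, c_a, c_b):
    # Nearest-centroid classifier (squared Euclidean distance).
    d_a = sum((u - v) ** 2 for u, v in zip(x, c_a))
    d_b = sum((u - v) ** 2 for u, v in zip(x, c_b))
    return "A" if d_a < d_b else "B"

train = ([("A", make_clip("A")) for _ in range(200)]
         + [("B", make_clip("B")) for _ in range(200)])
c_a = centroid([x for g, x in train if g == "A"])
c_b = centroid([x for g, x in train if g == "B"])

def accuracy(dataset):
    return sum(predict(x, c_a, c_b) == g for g, x in dataset) / len(dataset)

# Held-out set drawn the same way: the confound is intact, so the
# system looks like a competent genre classifier.
test_same = [(g, make_clip(g)) for g in ["A", "B"] * 100]
# Irrelevant transformation: every clip re-mastered to equal loudness.
# Genre is unchanged, so a system truly modelling genre should be unaffected.
test_fixed = [(g, make_clip(g, confounded=False)) for g in ["A", "B"] * 100]

print(f"accuracy, confound intact:  {accuracy(test_same):.2f}")   # high
print(f"accuracy, confound removed: {accuracy(test_fixed):.2f}")  # far lower
```

The gap between the two accuracies is the tell: the model’s apparent skill rides on the loudness confound, not on anything genre-like, which is exactly the kind of diagnosis the talk argues data scientists owe their audiences.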