In this class students develop interactive music projects: pieces of music that are not linear, but offer multiple dimensions for listeners to explore. Applications include generative music installations, novel instruments, participative performances, museum exhibitions, musical games, and tools for producing and teaching music.
In the first half of the semester, the class combines lectures, exercises in musical design and coding, and reading discussions, culminating in a midterm project. In the second half, the class shifts to a more self-directed, studio-style project development format.
For their final projects, students will take a project from concept to execution over several iterations, applying interaction design, creative coding, and music production tools and techniques. The project development process will include gathering aural references, creating user paths, and developing focused studies to explore interaction with specific musical elements. This work will inform the design and implementation of a functional prototype, which students will evaluate and refine to produce their final piece. Professional practitioners will visit the class to share their work in the field and provide feedback to students.
While in-class examples and tutorials will be done in Ableton Live, Max/MSP and Max for Live, students will be free to use other languages and frameworks for their final projects. Supporting examples and resources will be made available for web-based projects (p5.js + Tone.js), and physical projects (Arduino-compatible micro-controllers).
Students are welcome to come in with a specific project to develop; conceiving a project during the class is equally encouraged. Some experience making or producing music will be useful but is not required. ICM and Physical Computing, or equivalent experience, are required.
Create a first prototype of your experience concept (or a different, new idea, starting from any of Verplank's 'concerns'):
Digital Audio Workstation: Ableton Live +
Programming Languages, Libraries, Environments:
Elements of rhythm:
Inspiration piece: Clapping Music by Steve Reich
Discussion: Fluid formats, music as invention (notes and prompts here)
Interfaces: interactivity, visualization, probability, learning.
Design several interactions that focus on exploring rhythm / time grids. Prototype one of them. Post documentation here.
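As a starting point for rhythm explorations, the phasing process behind the inspiration piece, Steve Reich's Clapping Music, can be generated in a few lines of plain JavaScript. This is a minimal sketch of the rhythmic data only (no audio, no Tone.js); the function names are illustrative, not from any library, and the base pattern is the 12-eighth-note figure from the published score.

```javascript
// The 12-beat base pattern of Clapping Music (1 = clap, 0 = rest):
// groups of 3, 2, 1, and 2 claps separated by single rests.
const BASE = [1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0];

// Rotate a pattern left by `shift` positions (wrapping around).
function rotate(pattern, shift) {
  const n = pattern.length;
  return pattern.map((_, i) => pattern[(i + shift) % n]);
}

// Player 1 repeats BASE throughout; player 2 shifts one eighth note
// per section, returning to unison after 12 shifts (13 sections total).
function clappingMusicSections() {
  return Array.from({ length: 13 }, (_, s) => ({
    player1: BASE,
    player2: rotate(BASE, s),
  }));
}
```

Feeding each section's two patterns to a step sequencer (or a Tone.js `Sequence`) is one simple way to turn this data into sound.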
Elements of melody:
Listen + Ideate:
Presentations of rhythm explorations
Representation + Manipulation, continued:
Patches, M4L devices and Ableton Session from class are here.
Design and prototype an interaction that focuses on exploring melody - pitches, scales, intervals. Post documentation here.
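For melody explorations, a common building block is mapping scale degrees to MIDI notes and frequencies. Below is a minimal, audio-free sketch in plain JavaScript; the function names are illustrative (not from p5.js or Tone.js), the scale is assumed major, and the frequency formula is the standard equal-temperament conversion with A4 = MIDI 69 = 440 Hz.

```javascript
// Semitone offsets of the major scale relative to its root.
const MAJOR = [0, 2, 4, 5, 7, 9, 11];

// Map a scale degree (0 = root, 7 = octave above) to a MIDI note number.
// Negative degrees walk down the scale below the root.
function degreeToMidi(rootMidi, degree, scale = MAJOR) {
  const octave = Math.floor(degree / scale.length);
  const step = ((degree % scale.length) + scale.length) % scale.length;
  return rootMidi + octave * 12 + scale[step];
}

// Standard MIDI-to-frequency conversion (12-TET, A4 = 440 Hz).
function midiToFreq(midi) {
  return 440 * Math.pow(2, (midi - 69) / 12);
}
```

Constraining an interaction to scale degrees rather than raw pitches is one way to keep participant input melodic: any gesture mapped through `degreeToMidi` stays in key.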
Elements of sound: pitch, timbre, volume
Synthesis Techniques: Additive, Subtractive, and FM synthesis
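Of the techniques above, two-operator FM is compact enough to sketch as a formula: the output is a sine carrier whose phase is modulated by a second sine, y(t) = sin(2π·fc·t + I·sin(2π·fm·t)). The sketch below computes samples offline in plain JavaScript (no audio output, no Web Audio); the default parameter values are arbitrary choices for illustration.

```javascript
// One FM sample: carrier at fc Hz, modulator at fm Hz, modulation index `index`.
// Larger index values spread energy into more sidebands, i.e. a brighter timbre.
function fmSample(t, fc, fm, index) {
  return Math.sin(2 * Math.PI * fc * t + index * Math.sin(2 * Math.PI * fm * t));
}

// Render a buffer of FM samples (defaults are illustrative, not standard).
function renderFM({ fc = 220, fm = 110, index = 5, sampleRate = 44100, seconds = 1 } = {}) {
  const n = Math.floor(sampleRate * seconds);
  const out = new Float64Array(n);
  for (let i = 0; i < n; i++) out[i] = fmSample(i / sampleRate, fc, fm, index);
  return out;
}
```

In Max/MSP the equivalent patch is a `cycle~` modulating the frequency (or phase) of a second `cycle~`; the value of the offline version is that you can inspect the numbers directly.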
Read: The Importance of Parameter Mapping in Electronic Instrument Design by Marcelo Wanderley et al.
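One practical takeaway from the mapping literature is the distinction between linear and exponential mappings, and between one-to-one and one-to-many mappings. The sketch below illustrates both in plain JavaScript; all names and ranges are illustrative assumptions, not taken from the reading. Exponential mapping tends to suit perceptual parameters like frequency, where equal ratios (not equal differences) sound like equal steps.

```javascript
// Linear mapping of a normalized control value x in [0, 1] to [lo, hi].
function linMap(x, lo, hi) {
  return lo + x * (hi - lo);
}

// Exponential mapping: equal steps of x produce equal *ratios* of output.
// Requires lo > 0 and hi > 0.
function expMap(x, lo, hi) {
  return lo * Math.pow(hi / lo, x);
}

// One-to-many mapping: a single "brightness" control drives two
// synth parameters at once (ranges here are arbitrary examples).
function brightnessMapping(x) {
  return {
    cutoffHz: expMap(x, 100, 8000),
    resonance: linMap(x, 0.1, 0.7),
  };
}
```

A quick experiment for the suggested exercise: wire the same sensor or slider through `linMap` and `expMap` to a filter cutoff and compare how controllable each version feels.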
Option 1. Design and prototype an interaction that focuses on exploring timbre, either via synthesis or sampling.
In case you could use some direction/inspiration, here is a suggested exercise, based on the reading on Mapping:
If you are more interested in acoustic sounds or sampling, you can focus on samples/recordings instead of synths.
Option 2. Pick one of your previous explorations and develop it further.
Post documentation here.
Next week, everyone will present a project in class - either their timbre exploration, or an extension of a previous one.