1 Composing and Performing Interactive Music
- October 18-23, 2004
- Faculty of Music
- McGill University
Bruce Pennycook, DMA
Dept. of Composition, School of Music
Dept. of Radio-Television-Film, College of Communication
University of Texas at Austin
2 Introduction
3 Schedule and Topics
- Monday 3-6: Principles and Aesthetics of Interactive Music
- Tuesday 9-12: Impact of interactive music on performance practice and on compositional methods
- Wednesday 9-12: Systems Design (composer/performer perspective)
- Thursday 9-12: New directions and possibilities
- Friday 9-12: Music and Audio Visualization - interactive, real-time video-music
4 Objectives of the Seminars
- Present an overview of the history of interactive composition and performance
- Provide in-depth resources and materials for graduate-level study and research
- Propose new areas of creativity and research arising from emerging aesthetics, styles, and technologies, including visualization systems
5 Seminar Format
- Presentations by B. Pennycook
- Examples and demonstrations
- Daily contributions by students
- Short paper (due October 31, 2004)
6 Resources
- Read: Rowe, Winkler
- CMJ articles
- Recordings
- Max, Max/MSP, SuperCollider 2 and 3
7 Seminar 1: Principles and Aesthetics of Interactive Music
(who will turn the knobs when I die?)
Monday, October 18, 2004, 3:00 pm - 6:00 pm, Room LSR 3
8 Principles and Aesthetics of Interactive Music
- What is interactivity? (he says/she says)
- Are there pre-computer examples?
- What about live electronic music?
- Why are these early pieces different from our current thinking on this topic? That is, what role does the computer play in this?
- What separates computational control from machine control?
9 Principles and Aesthetics of Interactive Music
- What is the essential attraction of interactive music? Why do composers (and, for that matter, performers) want to create/play it?
- How has the software community influenced music making, and vice versa?
- What connection exists (or should exist) with other forms of interactive art?
10 Principles and Aesthetics of Interactive Music
- Tape with instruments/voices
- Instrumental karaoke (music minus n)
- Free the player from the tyranny of the clock (tape, CD)
- Like asking an actor to perform to video or film
- Eliminates the normal elasticity of gestures, especially on the large scale (local elasticity is possible - like Chopin LH/RH); rubato has to be built into the score
- No error tolerance - the clock keeps ticking no matter what
11 Principles and Aesthetics of Interactive Music
- BUT - (music minus n) works!
- Many very important pieces (Berio, Davidovsky, Risset, Lanza, Morrill, Parmegiani, ...) remain in the performed repertoire
- Low-stress rehearsal environment
- Easily replicated performance after performance
- Players can memorize aural events very precisely
- Players can emulate interaction convincingly
- Sonic domain can be managed easily
- Low demands for gear, setup, and technical help
12 Principles and Aesthetics of Interactive Music
- Principles of interactive music:
- Establish a dialogue between man and machine
- Permit modes and levels of adaptation
- Seek new performance results - from slightly interpretive to full improvisation
- Seek new, or at least dynamic, compositional results
- Explore the relationship between man and machine to some degree
- Explore machine autonomy to some extent
13 Principles and Aesthetics of Interactive Music
- Temporal Control
- What are the key interpretive issues?
- What degree of flexibility is required?
- What are the macro/micro levels of temporal control that would have a direct and audible impact on the listener?
- Why not just fake it?
- How can this be managed with minimal impact on the player or ensemble?
14 Principles and Aesthetics of Interactive Music
- Computer-assisted composition
- The grab/alter/play metaphor is very seductive, given the inherent compositional cohesion that ensues. But despite some clever efforts (Risset, Rowe, Pennycook, Pinkston, Hamel, Chafe, others), these have not been embraced into the repertoire.
- Is this a MIDI thing? Is MIDI dead?
- If yes, why? What problems exist with this method of work that prevent it from finding a solid niche in the overall electroacoustic or chamber music genre?
15 Principles and Aesthetics of Interactive Music
- Guided Improvisation
- http://smc04.ircam.fr/ProgWorkshop.html
- Workshop on computer improvisation, Oct 20, IRCAM
- What is guided improvisation?
- In what sense does a player actually improvise?
- In what sense can a computer improvise?
- One-way, two-way, n-way improvisation
- Stylistic norms and boundaries help manage the conditional environment
16 Principles and Aesthetics of Interactive Music
- Guided Improvisation
- The essential criterion for improvisation is the ability to listen
- Application of analytic/generative processes to the incoming musical information
- Segmentation, feature extraction, and pattern matching (see the sketch below)
- Modular compositional automaton
- Other factors such as parallel process management, multi-computer audio, MIDI, and data interfaces
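A minimal sketch of the segmentation / feature-extraction / pattern-matching chain listed above. It is not the software used in any of the pieces discussed in these seminars; the event format (onset time, MIDI pitch) and the rest threshold are assumptions made only for illustration.

```python
# Sketch: segment an incoming note stream at long gaps, extract interval
# features, and match stored motives. Illustrative, not any real listener.

GAP = 0.75  # silence longer than this (seconds) closes a segment (assumption)

def segment(events, gap=GAP):
    """Split (onset, pitch) events into phrases at long gaps."""
    phrases, current, last_onset = [], [], None
    for onset, pitch in events:
        if last_onset is not None and onset - last_onset > gap and current:
            phrases.append(current)
            current = []
        current.append((onset, pitch))
        last_onset = onset
    if current:
        phrases.append(current)
    return phrases

def intervals(phrase):
    """Feature extraction: melodic interval sequence of a phrase."""
    pitches = [p for _, p in phrase]
    return [b - a for a, b in zip(pitches, pitches[1:])]

def matches(phrase, motive_intervals):
    """Pattern matching: does the motive appear inside the phrase?"""
    iv, n = intervals(phrase), len(motive_intervals)
    return any(iv[i:i + n] == motive_intervals for i in range(len(iv) - n + 1))

# Example: detect a rising-fourth motive in a short input stream.
stream = [(0.0, 60), (0.3, 65), (0.6, 67), (2.0, 62), (2.3, 67)]
for phrase in segment(stream):
    if matches(phrase, [5]):            # +5 semitones = perfect fourth up
        print("motive found:", phrase)  # could trigger a generative response
```

A real guided-improvisation system would of course feed such matches into a compositional automaton rather than a print statement.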
17 Principles and Aesthetics of Interactive Music
- Audio Processing Compositions
- Modes of operation:
- "Outboard rack" approach
- New audio from previous audio (lag issues)
- Stored files
- Stored audio altered on the fly as per incoming data
- Real-time pitch tracking, segmenting, pattern matching (a huge topic, to be returned to later in the week; a small sketch follows below)
- Computational demands are much higher than MIDI, hence it must be worth it
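As one illustration of the pitch-tracking link in such a chain, here is a rough sketch that estimates the pitch of an incoming buffer by autocorrelation and turns it into a playback transposition for a stored file. The function names, thresholds, and reference pitch are assumptions for the example, not part of any system described here.

```python
# Sketch: crude pitch estimate -> playback-rate control for a stored sample.
import numpy as np

SR = 44100

def pitch_hz(buf, sr=SR, fmin=80.0, fmax=1000.0):
    """Autocorrelation pitch estimate; returns None if apparently unvoiced."""
    buf = buf - np.mean(buf)
    ac = np.correlate(buf, buf, mode="full")[len(buf) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    if hi >= len(ac):
        return None
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag if ac[lag] > 0.3 * ac[0] else None

def transposition_for(f0, reference=261.63):
    """Map detected pitch to a playback rate relative to a reference pitch."""
    return f0 / reference

# Example with a synthetic 440 Hz tone standing in for the live input.
t = np.arange(2048) / SR
buf = np.sin(2 * np.pi * 440.0 * t)
f0 = pitch_hz(buf)
if f0:
    print(f"detected {f0:.1f} Hz -> playback rate {transposition_for(f0):.2f}")
```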
18 Principles and Aesthetics of Interactive Music
- General Properties - Summary
- Modes of performance (sketched below):
- Active - triggers, footswitch, etc., initiated by the operator and/or performer(s)
- Passive - the system detects appropriate flags from processes such as beat detection, pitch detection, silence/pause detection, motion capture, time-code (clock), etc.
- Granularity:
- Section, event, note, clock time, ...
- Governed by compositional style and process complexity
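A small sketch of the active/passive distinction: one cue list, advanced either by an explicit footswitch (active) or by a silence-detection flag (passive). The class names and the two-second silence threshold are illustrative assumptions only.

```python
# Sketch: two ways to advance the same cue list - active and passive modes.
import time

class CueList:
    def __init__(self, cues):
        self.cues = list(cues)   # e.g. section- or event-level granularity
        self.index = 0

    def fire_next(self, reason):
        if self.index < len(self.cues):
            print(f"cue {self.index}: {self.cues[self.index]} ({reason})")
            self.index += 1

class Performance:
    SILENCE_LIMIT = 2.0  # seconds of silence that counts as a passive flag

    def __init__(self, cues):
        self.cuelist = CueList(cues)
        self.last_note_time = time.monotonic()

    # Active mode: operator or performer presses a footswitch.
    def footswitch(self):
        self.cuelist.fire_next("active: footswitch")

    # Passive mode: the system watches the input stream for a flag.
    def note_on(self, pitch, velocity):
        self.last_note_time = time.monotonic()

    def poll(self):
        if time.monotonic() - self.last_note_time > self.SILENCE_LIMIT:
            self.cuelist.fire_next("passive: silence detected")
            self.last_note_time = time.monotonic()

p = Performance(["start delays", "harmonize", "trigger sequence A"])
p.footswitch()      # active advance
p.note_on(60, 90)
p.poll()            # would advance only after SILENCE_LIMIT of no notes
```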
19 Principles and Aesthetics of Interactive Music
- General Properties - Summary
- Input and Sources:
- MIDI - seems passé now, but why?
- Audio - transducer properties; analytical, processing, and generative algorithms
- Motion/Image - use of gesture may be critical to effective interaction; a secondary channel?
- Output:
- MIDI? - this is really dead
- Audio - channels, mixing, loudspeaker management
- Video/Image - supporting
20 Principles and Aesthetics of Interactive Music
- General Properties - Summary
- Longevity (who will turn the knobs?):
- Very few pieces are playable without the composer or a trained operator present
- Players are largely disenfranchised due to the tech gap
- No obvious solution to hardware/software obsolescence
- Many components defy notation or even adequate description
- Teachers will never (?) undertake this repertoire, thus the master-apprentice cycle is essentially broken beyond repair
- Many acoustic composers consider this little more than gear-tinkering
- No systematic reviewing process
- The vast arena of techno-pop has totally overshadowed the genre
21 Seminar 1: Principles and Aesthetics of Interactive Music
END
Monday, October 18, 2004, 3:00 pm - 6:00 pm
22 Seminar 2: Composition and Performance of Interactive Music
(the chamber music tradition)
Tuesday, October 19, 2004, 9:00 am - 12:00 pm, Room LSR 1
23 Interactive Compositions
- Early adopters (MIDI):
- Roger Dannenberg, Chris Chafe, Joel Chadabe, Dexter Morrill, Keith Hamel, Russell Pinkston, Jean-Claude Risset, Morton Subotnick
- Early adopters (Audio/DSP):
- Cort Lippe, Zack Settel, Tod Machover, Russell Pinkston, ...
- Reference: Joel Chadabe ...
24 Interactive Compositions
- Pennycook PRAESCIO series
- First public performance - Buffalo, April 1987 - Praescio I
25 Praescio - I
- Soprano saxophone
- Original version constructed with Dannenberg's software cmidi (?) on a PC-AT/286
- Setup included:
- PC-AT with MIDI IN, IVL Pitchrider
- Sax data processed with delays, harmonizations, etc.
- PC-AT with a MIDI version of score-11 developed by BP and CS grads at Queen's University, called M-SCORE
- Files were hand-triggered on a section-by-section basis
- Extreme underflow occurred during performance, causing bursts and cascades
26 Selected Compositions: Praescio-I (rec. 1991, McGill)
27 Praescio - I
- Versions II, III
- Reconstructed using MIDI-LIVE software developed at McGill (Pennycook, Fujinaga, Hillborn, Quesnel)
- Current version - Max
- (more on this tomorrow)
28 Tornado (McGill EMS) - Praescio-II amnesia
29 Praescio - II amnesia
- Commissioned by Geoffrey Wright for the 25th anniversary of the Peabody Conservatory EMS
- With Morton Subotnick's Jacob's Room
- Instrumentation, system:
- Soprano, flute, violin, violoncello, DX7, system
- First version of the MIDI-LIVE software (low-latency MIDI composition system for real-time performance)
- Soprano and flute were close-mic'd and provided pitch data to the software via 2 IVL Pitchriders
- Stored sequences were triggered (by the operator)
- MIDI channel management was the crucial component
30 MIDI-LIVE 0.8
- Designed to permit fluent interchange of live data with stored data
- Premise was that MIDI files could be played like pieces of tape
- Transformations included:
- Assign out channel(s)
- Assign tempo, velocity (volume) on a per-track basis
- Specify harmonization, transposition
- Gather incoming note-ons, strip temporal info, resend in various ways (sketched below)
- Any number of tracks could be active at a time, each under its own local metronome
- Scripting-language playback interface for live performance that showed channel activity
- Read/process standard MIDI files produced from a sequencer, M-SCORE (score11/MIDI), OR recorded and stored internally
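A rough sketch of the gather/strip/resend idea listed above: collect incoming note-ons, discard their timing, then resend them transposed or harmonized on a chosen output channel. This is an illustration, not the original MIDI-LIVE code; the send() callback stands in for a real MIDI output.

```python
# Sketch: gather note-ons, strip temporal info, resend transformed.
from dataclasses import dataclass

@dataclass
class NoteOn:
    channel: int
    pitch: int
    velocity: int

class Gatherer:
    def __init__(self, send):
        self.send = send          # callable taking a NoteOn (assumed MIDI out)
        self.pool = []            # gathered notes, temporal info discarded

    def note_on(self, msg: NoteOn):
        self.pool.append(msg)     # keep pitch/velocity only, no timestamps

    def resend(self, out_channel, transpose=0, harmonize=()):
        """Replay the gathered pool: transposed, plus optional added voices."""
        for msg in self.pool:
            for offset in (transpose, *[transpose + h for h in harmonize]):
                self.send(NoteOn(out_channel, msg.pitch + offset, msg.velocity))
        self.pool.clear()

g = Gatherer(send=lambda m: print("out:", m))
g.note_on(NoteOn(1, 60, 100))
g.note_on(NoteOn(1, 64, 90))
g.resend(out_channel=5, transpose=12, harmonize=(7,))  # octave up plus a fifth above
```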
31 PRAESCIO-III The desert speaks - Vivien Spiteri, harpsichord, 1989
32 Praescio-III the desert speaks
- MIDI-LIVE 1.0 - much more stable program, more processing capabilities, better user interface for scripting
- Praescio-III for harpsichord and MIDI-LIVE
- The challenge was the harpsichord - the first interface was developed with Eric Johnstone at McGill using an organ retrofit MIDI package with a complete set of switches for the upper manual
- Small control unit attached to the harpsichord to permit the player to reset, advance, etc., and manage the footswitch and volume pedal (critical for the performance)
33 Praescio-III the desert speaks
- Version 2 of the interface was built for a concert in Europe - the original interface was stolen out of a van (very high return for sure!)
- New version entirely optical - individual LED/receptor pairs per note on the upper manual
- Worked OK but susceptible to sudden lighting changes!
- Never truly debugged, hard to regulate (but better than the mechanical one)
34 Praescio-III the desert speaks
- 3-movement format:
- I - colorize
- II - record/strip/process/play
- III - triggered sequences, colorize
35 PRAESCIO-IV - Jean-Guy Boisvert, clarinet, 1991
36 Praescio-IV
- Commissioned for the 1991 International Clarinet Conference by Jean-Guy Boisvert
- The challenge was to provide the clarinetist with maximum freedom of control over temporal components
- Non-improvisatory
- Cheap MIDI tone generators to facilitate travel (but that may have been a bad idea)
37 Praescio-IV
- Unique harness for the clarinet designed and built by BP and Eric Johnstone
- Provided attachment (DIN) for:
- Contact mic on the reed to improve IVL tracking
- 3 ultra-light keys placed by the LH thumb and RH little finger to permit cross-fingered sustain and trigger
- A volume pedal on the floor was unavoidable then; perhaps with gesture tracking this too could be eliminated
- Images in CMJ
- Performed successfully by many different players - the learning curve is very short using the device and score cues
38 Praescio-V
- Praescio-V - a kind of joint performance piece for Dexter Morrill and myself
- 1990 MIDI-LIVE road tour in Europe/Eastern Europe
- Trumpet, sax, and a small rack of MIDI tone generators; Yamaha DMP-11 MIDI-controlled mixer for processing audio (software controlled)
- No longer extant, but lots of fun to play
- Note that Dexter Morrill made numerous compositions using MIDI-LIVE and even shipped a system around to performers. A version was made for the Yamaha laptop that supported MIDI (not the CX5).
39 Praescio-VI
- Praescio-VI commissioned by Christine Little, Toronto flautist
- Several performances by different players - Montreal, Toronto, Ottawa, ICMC-San Jose, Mexico City
- Fairly stable, short learning curve
- 4 innovations for this piece:
- Max version of MIDI-LIVE under development
- MIDI Time Clip (remote signaling device, to be described tomorrow)
- Use of audio-on-CD as part of the controlled environment, more than MIDI output
- Digidesign Sample-Cell, hence entirely internal to the Mac
- But some serious level issues, hard to control in real time
40 Praescio-VII (piano and then some...) - alcides lanza, piano
41 Praescio-VII
- Praescio-VII (piano and then some) commissioned by ACREQ for alcides lanza to perform
- Many performances by different players, plus tours in Europe and South America by lanza
- MIDI Time Clip crucial for both the conception of the piece and the performance - very difficult to perform without feedback from the computer-player
42 Praescio-VII
- Most complex of the Praescio series
- MIDI data generated from Common Music/LISP programs written by BP -- SMFs
- Full Max implementation of MIDI-LIVE 2.x (Stammen)
- Several specialized Max objects written by Dale Stammen
- MIDI in from triggers (no piano data)
- Feedback to the player with MIDI Time Clip; complete Time Clip software package (Pennycook/Stammen)
43 Praescio-VII
- Large array of piano tone modules for MIDI out
- Dual CD under complex Max control to permit overlap
- 8 audio outs and 8 loudspeakers with real-time placement using a MIDI-controlled mixer (simple unity-gain device - no EQ; a placement sketch follows below)
- Full integration of prerecorded audio tracks and prepared MIDI sequences
- Temporal management of triggers only - no improvisation
- More than 60 events
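A hedged sketch of the idea behind real-time placement through a MIDI-controlled mixer: convert a desired position among 8 speakers into 8 channel-volume messages (MIDI CC 7). The equal-power pair panning and the send_cc() stand-in are illustrative assumptions, not a reconstruction of the original unity-gain mixer setup.

```python
# Sketch: position value -> per-channel MIDI volume messages for 8 outputs.
import math

NUM_OUTS = 8

def placement_gains(azimuth, num_outs=NUM_OUTS):
    """Equal-power pan between the two speakers adjacent to azimuth (0..1)."""
    pos = azimuth * num_outs
    lo = int(pos) % num_outs
    hi = (lo + 1) % num_outs
    frac = pos - int(pos)
    gains = [0.0] * num_outs
    gains[lo] = math.cos(frac * math.pi / 2)
    gains[hi] = math.sin(frac * math.pi / 2)
    return gains

def send_cc(channel, controller, value):
    print(f"MIDI ch {channel}: CC{controller} = {value}")  # stand-in for MIDI out

def place(azimuth):
    """Emit one CC7 (channel volume) message per mixer channel."""
    for ch, g in enumerate(placement_gains(azimuth), start=1):
        send_cc(ch, 7, int(round(g * 127)))

place(0.0)    # sound sits in speaker 1
place(0.31)   # sound shared between speakers 3 and 4
```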
44 Other mixed pieces
- The Black Page Tropes (1995)
- Drums, percussion, MIDI out, audio
- One section of triggered improvisation using loops derived from the Pyhrite external in Max
- Long, complex work, primarily for the players - audio/MIDI in a more supportive, commentary role
- The Yonge Street Variations (1998)
- Cello, MIDI (drum head), audio, DSP
- Fewer notes, more processing and sound
- Much greater reliance on stored audio files triggered by the player
- Based on a very early work for viola and percussion (recorded)
45 Summary
- Much was learned from the development of all these works
- End of the MIDI era (almost)
46 Seminar 3: Interactive System Design
Wednesday, October 20, 2004, 9:00 am - 12:00 pm, Room 806
47 Design issues
- What is the definition of Tod Machover's interactive solo cello piece for Yo-Yo Ma, Begin Again Again...?
48 Design issues
- 1 cellist, 6 technicians, 2 18-wheelers
- (the 1991 view)
49 Brief History
- Risset - Duet for One Pianist
- Jaffe/Schloss - Wildlife
- Wessel - phrase recorder
- Lippe - Music for Clarinet and ISPW
- Dannenberg - CMU MIDI Toolkit
- Rowe - Cypher
- Pennycook - MidiLive/Max, T-MAX, Listener Project (with Hillborn, Stammen, Quesnel)
50 Looking backwards
- Development of interactive, live-performance systems
- Max software (version 2), 1990/91
- This program was written for the 68k Mac
- PlaySMF (Dale Stammen - superb MIDI-LIVE implementation for Max)
- Led to more ambitious implementations, especially T-MAX, a version of Rowe's Cypher running across a Mac IIfx and 4 Inmos T805 Transputers
- Listener project - Stammen/Pennycook (see Rowe)
51 MIDI-LIVE - Max
- Max software (version 2), 1990/91
- This program was written for the 68k Mac
- PlaySMF (Dale Stammen - superb MIDI-LIVE implementation for Max)
- allnotesoff (Dale Stammen), case (Dale Stammen)
- (more on this tomorrow)
52 Looking back
- Demo of playSMF
- Pre-audio example
- Black Page Tropes
- Max - MIDI (playSMF) plus audio CDs
- Event-list driven, operator required (me)
- Drums, percussion, system
- Interactive drum solo (Pyhrite code in Max)
53 Design Criteria Today
- Compositional Strategy:
- Improvisatory?
- Accompanying? (as in my stuff, for the most part)
- Sound Art?
- Solo? Multiplayer?
- Multiple media types (visuals, video, dance, lighting)
54 Design Criteria Today
- Technical Strategy:
- Small, portable? (the G5 makes this almost a non-issue)
- Audio only? Controller functions?
55 Audio Pieces
- Panmure Vistas
- SC2, state-driven knobs
- Requires an operator for a truly fluent presentation
- Solo violin and SC2
- opening
- midpoint
- ending
57 Audio Pieces
- Club Life (2003)
- SC2, state-driven knobs, much more complex software
- Requires an operator for a truly fluent presentation
- 2 saxes, piano, system
- Not really interactive, just live
59 Audio Pieces
- Fast Dance (2004/05)
- Clarinet and Max/MSP (in progress)
- Commission from Jean-Guy Boisvert
- Interactive audio only, no MIDI
- Surround audio, 2-5 mics on stage (the clarinetist moves around somehow)
- Want to avoid the grab-and-hack metaphor and the rack-in-a-box
- Several high-intensity, rhythmic audio processes initiated by player position on stage, pitch (maybe), input volume (for sure)
- Stored audio clips processed under algorithmic control
60 Summary
- Audio is different, but not necessarily better than MIDI as a compositional tool - many clichés to avoid
- Multichannel audio leads to positional information as a data channel
- If G5-dependent, I will still need someone to turn the knobs when I die!
61 Guest Performers
- alcides lanza (Praescio VII - piano and then some)
- Director of the McGill EMS
- Composer and performer of electroacoustic works
- Jean-Guy Boisvert (Praescio IV)
- Specialist in new music for clarinet
- Extensive touring experience with MIDI and audio systems
62 Guest Performers
- 30-minute round table on performance issues
63 Seminar 4: New Directions, Emerging Technologies
Thursday, October 21, 2004, 9:00 am - 12:00 pm, Room LSR 2
64 Seminar 4
- 0900-1045 Emerging Technologies
- 1100-1200 Performer Interview
65 Seminar 4
- 0900-1045
- Emerging Technologies
- 1100-1200
- Performer Interviews
- alcides lanza, piano
- Jean-Guy Boisvert, clarinet
- Discussion of performance practice of interactive music (Praescio IV, VII)
66 Emerging Technologies
- What would a unified interactive performance/composition system look like?
- What do we (creators) want to do?
- What are the aesthetic objectives?
67 Emerging Technologies
- Single-voice pitch tracking combined with beat/rhythm tracking can produce stable and reliable real-time data input (a small sketch of the combination follows below).
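A minimal sketch of combining a pitch estimate with onset/tempo tracking into one control stream. The frame-level energies and pitch values are assumed to come from an analysis front end (such as the autocorrelation sketch earlier); the frame duration and thresholds are illustrative, not drawn from any particular system.

```python
# Sketch: merge onsets, pitch at each onset, and a tempo estimate.
from statistics import median

FRAME_DUR = 0.125  # seconds per analysis frame (assumption)

def onsets(energies, threshold=1.8):
    """Indices of frames whose energy jumps well above the previous frame."""
    return [i for i in range(1, len(energies))
            if energies[i] > threshold * energies[i - 1]]

def tempo_bpm(onset_frames, frame_dur=FRAME_DUR):
    """Median inter-onset interval converted to beats per minute."""
    gaps = [(b - a) * frame_dur for a, b in zip(onset_frames, onset_frames[1:])]
    return 60.0 / median(gaps) if gaps else None

def control_stream(energies, pitches):
    """Merge onset times, pitch at each onset, and a running tempo estimate."""
    frames = onsets(energies)
    bpm = tempo_bpm(frames)
    return [(i * FRAME_DUR, pitches[i], bpm) for i in frames]

# Example: four clear attacks, roughly 0.5 s apart (about 120 bpm).
energy = [0.1, 1.0, 0.3, 0.2] * 4
pitch  = [440, 440, 440, 440, 494, 494, 494, 494,
          523, 523, 523, 523, 587, 587, 587, 587]
for t, hz, bpm in control_stream(energy, pitch):
    print(f"t={t:.3f}s  pitch={hz} Hz  tempo~{bpm:.0f} bpm")
```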
68 Emerging Technologies
- Real-time feature detection that segments and identifies macro-structures can be implemented.
69 Emerging Technologies
- Motion capture systems permit reasonable and stable gesture capture, such that movements can be integrated into the overall data input.
70 Emerging Technologies
- Tracking of multiple audio sources can be enhanced with various types of source separation, then subjected to other processes.
71 Emerging Technologies
- Technologies to control a wide variety of devices in real time - lighting, stage mechanicals, video, audio diffusion - can be driven from a variety of computer-mediated inputs (audio, motion capture, other data sources).
72 Emerging Technologies
- Through more elaborate, dynamic data-mapping processes, musical input (audio, MIDI, motion) can be translated into complex visualizations (a small mapping sketch follows below).
- Note that visualization and show-control systems are merging under a few clean standards. Large-scale display systems using digital light projection (DLP) that are bright enough for 500-seat theaters open up huge possibilities for visualization. (More on this tomorrow.)
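A small sketch of such a data-mapping layer: musical features in, visual parameters out, with the mapping itself swappable at run time. The parameter names (hue, size, x) are invented for illustration; a real system would hand these values to a renderer (Jitter, OpenGL, a DLP rig, etc.).

```python
# Sketch: dynamic, swappable mapping from musical features to visual parameters.
def quiet_mapping(pitch, loudness, position):
    """Low-energy music -> cool colors, small shapes."""
    return {"hue": 0.6, "size": 20 + 200 * loudness, "x": position}

def dense_mapping(pitch, loudness, position):
    """Dense music -> hue follows pitch class, size follows loudness."""
    return {"hue": (pitch % 12) / 12.0, "size": 50 + 400 * loudness, "x": position}

class Visualizer:
    def __init__(self):
        self.mapping = quiet_mapping      # the mapping can change mid-piece

    def set_mapping(self, fn):
        self.mapping = fn

    def frame(self, pitch, loudness, position):
        params = self.mapping(pitch, loudness, position)
        print("draw:", params)            # stand-in for the render call

viz = Visualizer()
viz.frame(pitch=62, loudness=0.2, position=0.1)   # opening: sparse visuals
viz.set_mapping(dense_mapping)                    # a section change re-maps the data
viz.frame(pitch=71, loudness=0.9, position=0.8)
```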
73 Emerging Technologies
- Gaming and audience-participation technologies are beginning to impact presentation spaces.
- Perhaps not relevant to the aesthetics of the academy or Fine Arts schools, but audience expectations are shifting.
74 Emerging Technologies
- Theatrical spectacle is now the norm for major shows
- See DVD - Nine Inch Nails (2003)
75 Emerging Technologies
- WHO IS THE AUDIENCE???
- (break for 15)
76 Seminar 4
- 1100-1200
- Performer Interviews
- alcides lanza, piano
- Jean-Guy Boisvert, clarinet
- Discussion of performance practice of interactive
music (Praescio IV, VII)
77 Seminar 5: Music and Audio Visualization Systems
Public Presentation (90 minutes)
Friday, October 22, 2004, 9:00 am - 12:00 pm, Room LSR 3