(2016) [HTML] "Towards Machine Musicians Who Have Listened to More Music Than Us: Audio Database led Algorithmic Criticism for Automatic Composition and Live Concert Systems" Computers in Entertainment 14(3): 1-14 (Musical Metacreation Special Issue Part 2)
(2014) "Virtual musicians and machine learning: Towards musical familiars and touring AIs" Book chapter in Karen Collins, Bill Kapralos and Holly Tessler (eds.), The Oxford Handbook of Interactive Audio. New York: Oxford University Press
(2011) [PDF] "LL: Listening and Learning in an Interactive Improvisation System", Research Report
(2007) "Musical Robots and Listening Machines" Book chapter in N. Collins and J. d'Escrivan (eds.), The Cambridge Companion to Electronic Music
(2006) [abstract/citation info/download] "Towards Autonomous Agents for Live Computer Music: Realtime Machine Listening and Interactive Music Systems" PhD Thesis. Centre for Science and Music, Faculty of Music, University of Cambridge
(2006) "BBCut2: Incorporating Beat Tracking and On-the-fly Event Analysis", Journal of New Music Research 35(1): 63-70
(2004) [PDF] "Beat Induction and Rhythm Analysis for Live Audio Processing: 1st Year PhD Report" Centre for Science and Music, Faculty of Music, University of Cambridge. Released 18/06/04
Invited Talks and Selected Concerts
See the music section on this site as well
I gave a piano and electronics recital at Anglia Ruskin in Cambridge, as part of the lunchtime recital series at the Mumford Theatre, on 13th March 2009, premiering prototypes of two new machine listening works.
December 2006. Another autonomous interactive musical machine, DrumTrack, played with Matthew Yee-King at a LAM concert in Goldsmiths Great Hall.
October 2006. My autonomous interactive musical machine, the Ornamaton, appeared live at the Royal College of Music on October 5th.
March 2006. Baroqtronica: The Art of Machine Listening. Concert at Kettle's Yard, Cambridge, demonstrating autonomous artificial performers interacting with baroque musicians.
Sept 2005. Creative Applications of Music Information Retrieval Panel (headed by Michael Casey), ISMIR2005
April 2005. "The Problem of Phase Alignment in Computational Beat Induction on Musical Audio Signals" Talk at the second entrainment workshop, Ohio State University.
Feb 2005. Machine Enhanced Improvisation: concert in West Road demonstrating improvisation systems built with technology to be described in the thesis.
Oct 2004. "Computational Implementations of Beat Induction". Talk at the first entrainment workshop, CMS.
About Computational Beat Tracking
Computational beat tracking is the use of a computer to "extract" the metrical structure of music (in truth a culturally consensual and cognitive construction), typically from an audio signal alone. In musicologically unambiguous situations, this corresponds to locating the beats where the average listener would clap their hands or tap their foot along to a musical signal. In more complicated settings, a fuller extraction of metre is required: identifying multiple metrical levels, any duple/triple hierarchical structure, or even metrical frameworks outside the canon of conventional Western music theory. In such harder cases both the period and the phase of each level (as well as any further marking events or patterns) must be determined. For instance, beat tracking of Balkan dance music (aksak) or Norwegian Hardanger fiddle music would require the resolution of higher-level patterns than any simple isochronous beat.
Symbolic beat tracking operates on symbolic data, such as a MIDI representation, and is distinct from audio beat tracking, which deals with an audio signal and must be founded on some form of auditory frontend. Audio beat tracking does not necessarily require a full symbolic transcription of the signal, for the average listener too may not decompose music into perfectly discrete events.
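As a minimal sketch of what such an auditory frontend might compute (my own illustrative Python, not taken from any particular published system): an energy-flux detection function measures frame-to-frame rises in signal energy, a crude stand-in for the spectral-flux detectors real systems tend to use.

```python
def onset_strength(signal, frame_size=1024, hop=512):
    """Half-wave rectified frame-to-frame rise in RMS energy.

    Rises in energy give rough evidence of note onsets; a real
    frontend would more likely use spectral flux across bands.
    """
    energies = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size]
        energies.append((sum(x * x for x in frame) / frame_size) ** 0.5)
    # Keep only increases in energy: decays carry little onset evidence
    return [max(b - a, 0.0) for a, b in zip(energies, energies[1:])]
```

The output is a frame-rate detection function, one value per hop, on which a beat tracking model can then operate.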
Methods of computational beat tracking are varied, ranging from rule-based systems for symbolic data to correlation and oscillator methods applied to audio feature data. In engineering practice, the beat tracking model that infers the current metrical state may be distinct from the observation frontend that collates evidence from the audio stream.
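To illustrate the correlation approach in the simplest possible terms (again hypothetical code, not one of the systems above): given a frame-rate detection function, autocorrelation can propose a period, and a comb over candidate offsets can then fix the phase.

```python
def estimate_period(flux, min_lag=2, max_lag=128):
    """Return the lag (in frames) with the strongest autocorrelation."""
    best_lag, best_score = min_lag, float("-inf")
    for lag in range(min_lag, min(max_lag, len(flux) - 1) + 1):
        score = sum(flux[i] * flux[i - lag] for i in range(lag, len(flux)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag


def estimate_phase(flux, period):
    """Return the beat offset whose comb of positions gathers most energy."""
    scores = [sum(flux[p] for p in range(offset, len(flux), period))
              for offset in range(period)]
    return scores.index(max(scores))
```

Real systems go well beyond this sketch, weighting lag scores toward plausible tempi and tracking period and phase jointly over time rather than committing to a single global estimate.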
For more of my thoughts on these topics see the ICMPC2006 paper above, or my PhD thesis; both contain plenty of references to the work of other researchers.