Nick Collins



(2019) Nick Collins and Shelly Knotts [PDF] "A Javascript Musical Machine Listening Library". Proceedings of the International Computer Music Conference, New York

(2016) [HTML] "Towards Machine Musicians Who Have Listened to More Music Than Us: Audio Database led Algorithmic Criticism for Automatic Composition and Live Concert Systems" Computers in Entertainment 14(3): 1-14 (Musical Metacreation Special Issue Part 2)

(2014) "Virtual musicians and machine learning: Towards musical familiars and touring AIs" Book chapter in Karen Collins, Bill Kapralos and Holly Tessler (eds.), The Oxford Handbook of Interactive Audio. New York: Oxford University Press

(2011) [PDF] "LL: Listening and Learning in an Interactive Improvisation System", Research Report

(2011) [PDF] "Trading Faures: Virtual Musicians and Machine Ethics", Leonardo Music Journal 21: 35-39. Copyright © MIT Press 2011 [LINK]

(2010) [PDF] "Contrary Motion: An Oppositional Interactive Music System". Proceedings of NIME 2010, Sydney

(2010) [PDF] Nick Collins, Chris Kiefer, Zeeshan Patoli and Martin White. "Musical Exoskeletons: Experiments with a Motion Capture Suit". Proceedings of NIME 2010, Sydney

(2008) [PDF] "The Potential of Reinforcement Learning for Live Musical Agents". Proceedings of MML2008, ICML, Helsinki.

(2008) [PDF] "Reinforcement Learning for Live Musical Agents". Proceedings of ICMC2008, International Computer Music Conference, Belfast.

(2007) "Musical Robots and Listening Machines". Book chapter in N. Collins and J. d'Escrivan (eds.), The Cambridge Companion to Electronic Music

(2006) [abstract/citation info/download] "Towards Autonomous Agents for Live Computer Music: Realtime Machine Listening and Interactive Music Systems" PhD Thesis. Centre for Science and Music, Faculty of Music, University of Cambridge


(2007) [PDF] "Matching Parts: Inner Voice Led Control for Symbolic and Audio Accompaniment". Proceedings of NIME

(2006) "BBCut2: Incorporating Beat Tracking and On-the-fly Event Analysis", Journal of New Music Research 35(1): 63-70

(2006) [PDF] "Towards a Style-Specific Basis for Computational Beat Tracking". Proceedings of ICMPC2006. International Conference on Music Perception and Cognition, Bologna

(2006) [PDF] "Investigating Computational Models of Perceptual Attack Time". Proceedings of ICMPC2006. International Conference on Music Perception and Cognition, Bologna

(2005) [PDF] "A Change Discrimination Onset Detector with Peak Scoring Peak Picker and Time Domain Correction". Music Information Retrieval Exchange MIREX2005

(2005) [PDF] "Using a Pitch Detector for Onset Detection". International Conference on Music Information Retrieval ISMIR2005

(2005) [HTML] Nick Collins and Ian Cross. "Beat Tracking and Reaction Time". Rhythm Perception and Performance Workshop RPPW10

(2005) [PDF] "An Automated Event Analysis System with Compositional Applications". Proceedings of ICMC2005, International Computer Music Conference, Barcelona.

(2005) [PDF] "DrumTrack: Beat Induction from an Acoustic Drum Kit with Synchronised Scheduling". Proceedings of ICMC2005, International Computer Music Conference, Barcelona.

(2005) [PDF] "A Comparison of Sound Onset Detection Algorithms with Emphasis on Psychoacoustically Motivated Detection Functions". Proceedings of AES118 Convention

(2004) [PDF] "On Onsets On-the-fly: Real-time Event Segmentation and Categorisation as a Compositional Effect". Proceedings of SMC04, Sound and Music Computing, IRCAM, Paris, Oct 20-22.

(2004) [PDF] "Beat Induction and Rhythm Analysis for Live Audio Processing: 1st Year PhD Report" Centre for Science and Music, Faculty of Music, University of Cambridge. Released 18/06/04

Invited Talks and Selected Concerts

See the music section on this site as well.

I gave a piano and electronics recital at Anglia Ruskin in Cambridge as part of the lunchtime recital series at the Mumford Theatre, on March 13th 2009, where I premiered prototypes of two new machine listening works.

December 2006. Another autonomous interactive musical machine, DrumTrack, played with Matthew Yee-King at a LAM concert in Goldsmiths Great Hall.

October 2006. My autonomous interactive musical machine, the Ornamaton, appeared live at the Royal College of Music on October 5th.

March 2006. Baroqtronica: The Art of Machine Listening. Concert at Kettle's Yard, Cambridge, demonstrating autonomous artificial performers interacting with baroque musicians.

Sept 2005. Creative Applications of Music Information Retrieval panel (headed by Michael Casey), ISMIR2005.

April 2005. "The Problem of Phase Alignment in Computational Beat Induction on Musical Audio Signals" Talk at the second entrainment workshop, Ohio State University.

Feb 2005. Machine Enhanced Improvisation: concert in West Road demonstrating improvisation systems built with technology to be described in the thesis.

Oct 2004. "Computational Implementations of Beat Induction". Talk at the first entrainment workshop, CMS.

About Computational Beat Tracking

Computational beat tracking is the use of a computer to "extract" the metrical structure of music (in reality a culturally consensual, cognitive construction), typically from an audio signal alone. In musicologically unambiguous situations, this corresponds to locating the beats where the average listener would clap their hands or tap their foot along to the music. In more complicated settings, fuller metre extraction is required, involving the identification of multiple metrical levels, any duple/triple hierarchical structure, or even metrical frameworks outside the canon of conventional Western music theory; in these harder cases both phase and period (as well as further marker events or patterns) must be determined. For instance, beat tracking of Balkan dance music (aksak) or Norwegian Hardanger fiddle music requires the resolution of higher-level patterns than any simple isochronous beat.

Symbolic beat tracking operates on symbolic data, such as a MIDI representation, and is distinct from audio beat tracking, which deals with an audio signal and must rest on some form of auditory frontend. Audio beat tracking does not necessarily require a symbolic transcription of the signal, since the average listener likewise may not decompose music into perfectly discrete events.

Methods of computational beat tracking are varied, from rule-based systems for symbolic data to correlation and oscillator methodologies applied to audio feature data. In engineering practice, the beat tracking model that infers the current metrical state may be distinct from the observation frontend that collates evidence from the audio stream.
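The correlation approach can be sketched very simply. The function below (a toy illustration of my own devising, not any particular system described on this page) takes an onset-strength envelope from some hypothetical audio frontend, autocorrelates it to pick a period within a plausible tempo range, then scores each candidate phase offset by how well an impulse comb at that period lines up with the envelope:

```python
import numpy as np

def estimate_beat(onset_env, frame_rate, min_bpm=60, max_bpm=180):
    """Estimate beat period and phase from an onset-strength envelope.

    A minimal correlation-style sketch: autocorrelation of the
    (mean-removed) envelope selects the period; cross-correlation
    with an impulse comb at that period selects the phase.
    """
    n = len(onset_env)
    env = onset_env - onset_env.mean()
    # Full autocorrelation; entry at lag k measures self-similarity
    # of the envelope shifted by k frames.
    ac = np.correlate(env, env, mode="full")[n - 1:]
    # Restrict candidate lags to the plausible tempo range.
    min_lag = int(frame_rate * 60.0 / max_bpm)
    max_lag = int(frame_rate * 60.0 / min_bpm)
    period = min_lag + int(np.argmax(ac[min_lag:max_lag + 1]))
    # Phase: the offset whose comb of beat positions gathers the
    # most onset strength.
    phase = max(range(period), key=lambda p: onset_env[p::period].sum())
    bpm = 60.0 * frame_rate / period
    return period, phase, bpm
```

A real tracker would replace the raw envelope with a perceptually motivated onset detection function and would track period and phase over time rather than assuming a single fixed tempo, but the period/phase decomposition shown here is the same one discussed above.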

For more of my thoughts on these topics see the ICMPC2006 paper above, or my PhD thesis, both of which contain plenty of references to the work of other researchers.