
TECHNO-UTOPIA

TECHNO-UTOPIA, a concerto for piano, synthesizers and newly created intelligent instruments, was commissioned by BBC Radio 3 for the BBC Philharmonic Orchestra, and by the Rundfunk Sinfonieorchester Berlin. It was premiered by the BBC Philharmonic with Zubin Kanga (soloist) and Jack Sheen (conductor) on 11th July 2025 at the BBC Philharmonic Studio, Salford. It will receive its German premiere with the Rundfunk Sinfonieorchester Berlin, Zubin Kanga (soloist) and Vimbayi Kaziboni (conductor) on 11th October 2026 at the Berlin Philharmonie.


I am in the process of publishing the research element of this project, but I give an overview below. There are, of course, many aesthetic considerations I won't go into here, though they played a crucial role in what music was written and why.


TECHNO-UTOPIA was a research project that developed physical musical instruments with embedded AI models, designed in collaboration with the BBC Philharmonic and the Rundfunk Sinfonieorchester Berlin. The project responded to a contemporary context in which AI has become controversial in the arts, with many musicians viewing it as a threat to human creativity.


Creative-First and Ethical Approach 

The project used only data with explicit permission: approximately 2,600 hours of BBC Radio 3 recordings featuring the BBC Philharmonic. This "local," site-specific data contrasted sharply with the scraped internet content on which commercial models are trained. The approach reframed archives not as fuel for AI automation, but as historical records on which AI could offer new perspectives.


Through workshops with professional musicians, the project identified what performers actually wanted from creative AI:

  • Something that sounds different and new, not replicative

  • Highly responsive, tactile instruments (not abstract interfaces)

  • Reliable tools that behave logically in performance

  • Instruments that act somewhat familiar (like an improvising collaborator) but not too familiar

 

With the above in mind, I wrote music for three embedded AI instruments that were designed or adapted for this project, with the main focus on the first.


"Stacco" with RAVE

The Stacco is a physical instrument embedded with magnetometers that control an 8-dimensional latent space trained on the orchestral archive, creating what sounds to me like a timbral soup of the orchestra. The Stacco was designed by Nicola Privato and Giacomo Lepri, and we developed this version of it for the piece at the Intelligent Instruments Lab in Iceland, together with Victor Shepardson, in November 2024. The latent spaces it controls were created with RAVE.
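
To make the signal path concrete, here is a minimal Python sketch of the idea, not the Stacco's actual firmware: sensor readings steer a point in the latent space, and the RAVE decoder turns that point into sound. It assumes a RAVE model exported to TorchScript (which the acids-ircam RAVE tooling supports) with an 8-dimensional latent space; the file name, the read_magnetometers() function and the scaling are hypothetical placeholders.

import torch

model = torch.jit.load("rave_bbc_archive.ts")  # hypothetical export name
model.eval()

def read_magnetometers():
    # Placeholder for the Stacco's sensor input: eight values in [-1, 1].
    return [0.0] * 8

with torch.no_grad():
    sensors = read_magnetometers()
    # One latent frame: shape (batch=1, latent_dim=8, time=1). The x2
    # scaling is an assumption; a real mapping would be calibrated by ear.
    z = torch.tensor(sensors).view(1, 8, 1) * 2.0
    audio = model.decode(z)  # -> (batch, channels, n_samples) audio block
    # ...stream `audio` to the sound card (e.g. with the sounddevice library)

In the instrument itself this mapping runs continuously, so small movements of the magnets sweep the decoder through that orchestral timbral soup.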


"Archive Dreamer"

The Archive Dreamer used machine listening to match the live performers' notes with fragments from the BBC archive, exploring how humans and machines hear differently. It was designed in MaxMSP using FluCoMa.
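
The piece implemented this in MaxMSP with FluCoMa, but the underlying matching idea can be sketched in a few lines of Python: summarise each archive fragment as an averaged MFCC vector, then retrieve the fragment nearest to whatever the live player just sounded. The file names and the choice of 13 MFCCs are illustrative assumptions, not the settings used in the piece.

import numpy as np
import librosa
from scipy.spatial import cKDTree

def mfcc_fingerprint(path):
    # Time-averaged MFCCs: a crude summary of how a machine "hears" a sound.
    y, sr = librosa.load(path, sr=22050, mono=True)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

# Index a (hypothetical) folder of pre-sliced archive fragments.
fragment_paths = ["archive/frag_000.wav", "archive/frag_001.wav"]  # etc.
index = cKDTree(np.stack([mfcc_fingerprint(p) for p in fragment_paths]))

# Match an incoming live note: the nearest neighbour in MFCC space is what
# the machine "hears" as most similar, often not what a human would choose.
distance, nearest = index.query(mfcc_fingerprint("live_note.wav"))
print("machine's choice:", fragment_paths[nearest])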


Modified ROLI Seaboard

I used the archive as a dataset for concatenative synthesis (see here). The source audio came from a ROLI Seaboard, with finger pressure directly controlling the balance between that source audio and a reconstruction of it built from the archive.
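
As a rough illustration of that mapping (not ROLI's API, and not my actual patch), here is a Python sketch in which per-note pressure on the usual 0-127 MIDI scale sets an equal-power crossfade between the two signals:

import numpy as np

def crossfade(source, reconstruction, pressure):
    # Equal-power blend: pressure 0 = all Seaboard source, 127 = all archive.
    # The equal-power curve is my assumption here, not necessarily the curve
    # used in the piece; it keeps perceived loudness steady across the fade.
    p = np.clip(pressure / 127.0, 0.0, 1.0)
    return np.cos(p * np.pi / 2) * source + np.sin(p * np.pi / 2) * reconstruction

# Usage with dummy one-second buffers at 48 kHz:
sr = 48000
source = 0.1 * np.random.randn(sr).astype(np.float32)
reconstruction = 0.1 * np.random.randn(sr).astype(np.float32)
out = crossfade(source, reconstruction, pressure=90)  # leaning toward the archive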


The project resulted in the first orchestral music to use embedded AI instruments. Key creative discoveries included:

  • Improvisation becoming far more integrated into the compositional process

  • New questions of materiality, and of what music is "idiomatic" for particular materials

  • An exploration of memory, and of how embedded AI treats musical structure and time differently from humans


Anecdotally, the orchestral musicians seemed to enjoy, appreciate and support this approach to imagining the technology's role in their practice. I took this as a vote of confidence, since classical musicians can be sceptical of any technology associated with the term AI. I wrote more about this here.

