The Digitopia experience premiered on 12 February 2016 at Lakeside Arts Centre, Nottingham, alongside Digitopia the stage show. Digitopia is currently touring to 16 UK venues in total. It was developed in close collaboration with the Nottingham-based Tom Dale Company. Previous posts have covered the Digitopia premiere, the collaboration with the Tom Dale Company, the initial design stages, the final design and the technology used. Here we reflect briefly on the core ideas behind the musical composition.
The interaction of graphical and musical elements took shape during an iterative development process. Initially, we looked towards prior research into musical and topological relationships. This led us to the Tonnetz, a conceptual visualization of the relationships between notes and harmonies in traditional Western tonal music (see Figure 1). The parallels between this and the Digitopia pyramid shape were clear and intriguing. At first we considered a scenario where each line of the pyramid shape on the Digitopia user interface would represent a single note of a chord, so that, when combined, the full sonority of the chord would sound. A variation on this theme had the addition of each line enact a change of chord harmony, following the relationships set out in the Tonnetz.
Tonnetz (By Hyacinth (Own work) [CC BY-SA 4.0 (http://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons)
These initial ideas concerned the construction of musical harmonies, and while they reflected the notion of the Tonnetz well, they did not address how the music could unfold rhythmically: if each note (line) simply sounded a continuous tone, the musical content would quickly become monotonous. At this point we decided to seek inspiration and guidance from the Digitopia performance soundtrack. We met with Jo Wills, the Digitopia composer, to discuss our ideas and to listen to some of the music he was composing for the show. Jo kindly offered to share some of his raw music with us. This included specific musical themes as well as the types of sounds he was using, a mixture of analogue synthesizers and orchestral instruments. This enabled us to create a soundtrack for the tablet application that complemented elements of the show's musical score. Following this meeting we developed an approach where each visual line of the pyramid represented a different musical element (instrument) that, when combined with the others, formed a single detailed musical arrangement. Whilst this drew us away from the strict relationships of the Tonnetz, its influence remained: these musical elements were typically constructed around complementary harmonic content.
Once this first musical arrangement was completed, we turned our attention to how we could make multiple different instances of it to broadcast on each of the tablets. This represented a challenge: the music broadcast from each tablet would combine with that of the other tablets sounding in close proximity, so these variations needed to work together musically. We took a simple approach, having the same music (melody and rhythms) play on each tablet but using different sounds to present the six musical elements. As a result, each tablet broadcast a unique variation of the same music. We created six different MIDI files. Each file contained a complete arrangement for one tablet, with each of the six musical elements mapped to a different MIDI channel. MIDI channels permit individual control over musical elements within a complete arrangement, such as volume, pan, and mute.
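As a loose illustration of what per-channel control means at the protocol level (this is not the project's actual code), per-channel volume, pan, and mute all boil down to three-byte MIDI control-change messages. A minimal sketch in Python, with channel 3 standing in for a hypothetical fourth musical element:

```python
def control_change(channel, controller, value):
    """Build a raw 3-byte MIDI control-change message.

    channel: 0-15 (one per musical element here),
    controller and value: 0-127.
    """
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    # Status byte 0xB0 carries the channel number in its low nibble.
    return bytes([0xB0 | channel, controller, value])

# Standard MIDI controller numbers: 7 = channel volume, 10 = pan.
VOLUME, PAN = 7, 10

half_vol = control_change(3, VOLUME, 64)  # channel 3 at half volume
pan_left = control_change(3, PAN, 0)      # channel 3 panned hard left
muted    = control_change(3, VOLUME, 0)   # "mute" = volume down to zero
```

Because each element lives on its own channel, a single line of the pyramid can be faded, panned, or silenced without touching the rest of the arrangement.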
Next we chose our sound sources. MIDI is essentially a trigger-based communication protocol that only tells a sound source which notes to play and when. Typically, sound sources in the MIDI domain are either samples (recorded notes from a real instrument such as a piano) or synthesized. We chose to create a SoundFont for each tablet. A SoundFont is a bank of sampled notes containing a number of different sound sets (i.e. instruments). Our SoundFonts each contained six sound sets, each mapped to the MIDI channel representing one musical element (line).
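In MIDI terms, assigning a sound set to a channel is done with a two-byte program-change message selecting a preset within the SoundFont. The sketch below illustrates the idea; the preset numbers are made up for illustration and are not the actual Digitopia mappings:

```python
# Hypothetical mapping of the six MIDI channels (one per visual line)
# to preset numbers inside one tablet's SoundFont.
LINE_PRESETS = {0: 0, 1: 4, 2: 9, 3: 12, 4: 17, 5: 23}

def program_change(channel, preset):
    """Build a 2-byte MIDI program-change message selecting a preset
    (0-127) on a channel (0-15)."""
    assert 0 <= channel <= 15 and 0 <= preset <= 127
    # Status byte 0xC0 carries the channel number in its low nibble.
    return bytes([0xC0 | channel, preset])

# One program change per channel binds each musical element to its sound set.
setup = b"".join(program_change(ch, p) for ch, p in LINE_PRESETS.items())
```

Swapping the SoundFont (or the preset mapping) while keeping the MIDI files identical is what lets each tablet broadcast a unique-sounding variation of the same music.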
In a future post we will complete the description of the musical composition of the Digitopia Experience.
The Digitopia team: Tony Glover, Adrian Hazzard, Holger Schnädelbach, Laura Carletti, Ben Bedwell