Horizon Blog

Digitopia Experience – Technology

The Digitopia experience premiered on 12 February 2016 at Lakeside Arts Centre, Nottingham, alongside Digitopia the stage show. Digitopia is currently touring to 16 UK venues in total. It was developed in close collaboration with the Nottingham-based Tom Dale dance company. Previous posts have covered the Digitopia Premiere, the collaboration with the Tom Dale company, the initial design stages and the final design. Here we reflect briefly on the technology.

The design process involved discussion of pre-show and post-show access to the Digitopia Experience. This essentially led the development process towards a web-based design. Building an application in the browser offers many advantages: it allows a wide range of users to access the experience regardless of what device or platform they are using, and it speeds up development because we only have to develop for the browser rather than for multiple operating systems.

The next step was to look at the visual designs we had developed so far, to see how easily they could be constructed and what form the functionality would take. It soon became clear that certain elements of the design were mandatory: a number of lines had to be placed so as to form a specific shape, and this interaction would then trigger some form of musical output. It would also be advantageous, if time allowed, to make the experience slightly different on each tablet within the venue itself and possibly offer some form of harmonious synchronisation between users.

Being a web application, this was implemented in HTML5, using CSS to style elements and relevant JavaScript libraries to implement the functionality. The graphical content was always going to be kept as simple as possible, for a number of reasons: ease of design, the intended user group of 5 to 10 year olds, and the desire for a clean, uncluttered look. With this in mind we ruled out 3D graphics in favour of 2D shapes that could be dragged around the screen to form the relevant shapes. This prompted the use of the 2D game library Phaser (www.phaser.io), which allows 2D elements to be created and manipulated on screen. Initial designs suggested that each participant would have a number of lines at the bottom of the screen that they could then drag onto a pyramid-like structure. Each line dragged onto the pyramid would then trigger the playing of an associated sound or piece of music.
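As a rough illustration, a Phaser 2 set-up along these lines might look something like the sketch below. The asset path, screen dimensions and number of lines are illustrative rather than taken from the actual app; the Phaser calls (game creation, image loading, drag-enabled sprites) are standard Phaser 2 functionality.

```js
// Minimal Phaser 2 sketch: a game canvas with six draggable "line" sprites
// arranged along the bottom of the screen (the greyed-out tray area).
var game = new Phaser.Game(1024, 768, Phaser.AUTO, 'digitopia', {
    preload: preload,
    create: create
});

function preload() {
    game.load.image('line', 'assets/line.png');   // illustrative asset path
}

function create() {
    for (var i = 0; i < 6; i++) {
        var line = game.add.sprite(100 + i * 150, 700, 'line');
        line.anchor.set(0.5);        // drag around the sprite's centre
        line.startX = line.x;        // remember the tray position for later
        line.startY = line.y;
        line.inputEnabled = true;    // make the sprite respond to input
        line.input.enableDrag();     // and allow it to be dragged around
    }
}
```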

The drawing of the pyramid and the circles at its end points is achieved via basic drawing commands on an HTML5 canvas. The lines in the greyed-out portion of the screen are a selection of images that can easily be manipulated via the Phaser library. The additional functionality required was to determine when a line was over the shape and “snap” it to the appropriate edge. The converse also had to be catered for, whereby lines can be dragged off the pyramid and back to their starting positions, which also stops the associated musical track. A reset button was also added so that the participant could return to a stable state, with everything restored to its starting position.
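A sketch of how that snapping behaviour could be wired up is shown below. The only library calls assumed are Phaser 2's onDragStop signal and Phaser.Math.distance; nearestFreeSlot, SNAP_THRESHOLD and the muteTrack/unmuteTrack helpers (sketched further down) are hypothetical names used for illustration.

```js
// Hooked up when each line sprite is created:
//   line.events.onDragStop.add(onDragStop, this);
function onDragStop(line) {
    var slot = nearestFreeSlot(line);   // hypothetical helper: closest unoccupied pyramid edge

    if (slot && Phaser.Math.distance(line.x, line.y, slot.x, slot.y) < SNAP_THRESHOLD) {
        line.position.set(slot.x, slot.y);   // snap the line onto the pyramid edge
        line.angle = slot.angle;
        line.slot = slot;
        slot.occupied = true;
        unmuteTrack(slot.trackIndex);        // bring in the associated musical layer
    } else {
        if (line.slot) {                     // line dragged off the pyramid again
            muteTrack(line.slot.trackIndex); // silence its layer
            line.slot.occupied = false;
            line.slot = null;
        }
        line.position.set(line.startX, line.startY);  // send it back to the tray
    }
}
```

The reset button can simply run the same "else" logic over every line: return each sprite to its tray position, clear its slot and mute its track.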

The obvious major development task was to determine how the musical score would be invoked and what structure this would take in relation to the designated shape. It was not enough simply to have a sound play each time the user dragged a line onto the pyramid; that would have been relatively simple, and indeed there was some basic functionality to support it within the Phaser library. What we actually needed was a continually playing score of music that would somehow be enhanced with the addition of each line. We therefore decided to adopt MIDI as the musical framework. This could be configured such that there would be six tracks of music playing (one for each line of the pyramid) that could subsequently be muted or unmuted as each line was added or taken away.
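As a sketch of that structure, each pyramid line could map onto one MIDI channel, with muting implemented as a per-channel volume change so that the underlying score never stops and each layer always re-enters in time. The channel-per-line mapping is an assumption for illustration; MIDI.setVolume is the channel volume call provided by the MIDI.js library discussed below.

```js
// One MIDI channel per pyramid line. Muting only changes the channel volume,
// so all six tracks keep playing in lockstep and stay in sync when unmuted.
var NUM_LINES = 6;

function muteTrack(i) {
    MIDI.setVolume(i, 0);     // silence channel i; playback position is unaffected
}

function unmuteTrack(i) {
    MIDI.setVolume(i, 127);   // full volume on channel i
}

function muteAll() {
    for (var ch = 0; ch < NUM_LINES; ch++) {
        muteTrack(ch);        // used at start-up and by the reset button
    }
}
```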

MIDI is still relatively new in the realm of the web browser, but some JavaScript libraries do exist that can play MIDI tracks. Notable examples are the jasmid library (https://github.com/gasman/jasmid), a MIDI file reader and simple synthesiser, and the MIDI.js library (https://mudcu.be/midi-js/), which builds on the former and adds significant functionality as well as visual demonstrations of various MIDI playback techniques. The MIDI.js library was therefore used as the basis for the music playback within the Digitopia application. It should also be noted that these libraries do not offer MIDI functionality beyond being able to take a MIDI file and play it in the browser, typically via the Web Audio API (https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API). Hence we constructed a series of MIDI tracks that were designed to interact well with one another, such that the combination of multiple tracks playing together would hopefully create something harmonious as opposed to a cacophony of noise!
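Putting this together, loading and starting the continuous score with MIDI.js might look roughly like the sketch below. The soundfont and file paths and the choice of instrument are illustrative, not the actual assets used in Digitopia.

```js
// Load the MIDI.js soundfont, then load the six-track score and start it
// with every channel muted; lines dragged onto the pyramid unmute channels.
MIDI.loadPlugin({
    soundfontUrl: './soundfont/',           // illustrative path
    instrument: 'acoustic_grand_piano',     // illustrative instrument
    onsuccess: function () {
        MIDI.Player.loadFile('assets/digitopia-score.mid', function () {
            muteAll();                      // see the sketch above
            MIDI.Player.start();            // the score now runs continuously
        });
    }
});
```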

In a future post we will describe the musical composition for the Digitopia Experience.

The Digitopia team: Tony Glover, Adrian Hazzard, Holger Schnädelbach, Laura Carletti, Ben Bedwell
