Horizon Blog

Adaptive Interactive Movies – Partnership working – a Case Study

When Richard Ramchurn, filmmaker and academic at the University of Nottingham and Horizon Digital Economy Research, wanted to push the boundaries of how interactive a film could be, he turned to former colleague and entrepreneur Professor Michel Valstar.

Valstar’s start-up, BlueSkeye AI, is a spin-out from the University of Nottingham, designed to commercialise the Professor’s 18 years of experience in affective computing and social signal processing. That experience has led to the publication of over 100 papers, cited more than 16,000 times.

BlueSkeye has developed technology that uses machine learning to scan the face and objectively, automatically and repeatedly analyse affective and social signals from face and voice data, monitoring an individual’s affective and socially expressive behaviour and inferring emotion.

The company uses cameras to monitor the facial muscle movement underpinning facial expression, identifying how much those muscles are activated. The cameras also determine the direction of eye gaze, and the pose of the head. This brings objective measures such as the frequency and intensity of facial muscle actions, head actions, and social gaze to areas which have traditionally been dominated by subjective interpretations.  Combining these allows BlueSkeye to identify and assign a numeric value to how actively engaged the user is with the task in hand – in this case watching a film.  BlueSkeye AI call this “Arousal.”
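As an illustration of how such objective measures might be folded into a single engagement score, the sketch below combines normalised facial muscle activation, gaze direction and head movement into one value. The feature names, weights and scaling are assumptions made for the example, not BlueSkeye’s actual calibration or implementation.

```python
from dataclasses import dataclass


@dataclass
class FrameFeatures:
    """Objective per-frame measures, each assumed normalised to [0, 1]."""
    au_intensity: float    # mean activation of facial muscle (action unit) movements
    gaze_on_screen: float  # 1.0 if gaze is directed at the screen, else 0.0
    head_motion: float     # magnitude of head movement


def arousal_score(frame: FrameFeatures, weights=(0.5, 0.3, 0.2)) -> float:
    """Combine objective measures into a single engagement ('Arousal') value.

    The weights are illustrative assumptions chosen for this sketch.
    """
    w_au, w_gaze, w_head = weights
    return (w_au * frame.au_intensity
            + w_gaze * frame.gaze_on_screen
            + w_head * frame.head_motion)


# Example: a viewer with moderate expression, looking at the screen, head mostly still.
print(arousal_score(FrameFeatures(au_intensity=0.4, gaze_on_screen=1.0, head_motion=0.1)))
```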

They use the same approach to assess how positive or negative the user is feeling (Valence) and how able they feel to deal with the cause of the emotion (Dominance).

By plotting these three values, with Valence on the x-axis, Arousal on the y-axis and Dominance as the plot’s depth, a point, or collection of points, within the three-dimensional space can be identified and given a label. Typically the labels commonly applied to emotion are used, for example “excited”, “calm” or “angry”. But because this analysis is performed over time and in a continuous three-dimensional space, it can accommodate many more labels.
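One simple way to attach a label to a point in that three-dimensional space is to pick the nearest of a set of labelled prototype points. The sketch below shows the idea; the prototype coordinates are rough illustrative assumptions, not a published mapping.

```python
import math

# Illustrative prototype points in (Valence, Arousal, Dominance) space,
# each coordinate in [-1, 1]. These positions are assumptions for the example.
EMOTION_PROTOTYPES = {
    "excited": ( 0.7,  0.8,  0.5),
    "calm":    ( 0.6, -0.6,  0.4),
    "angry":   (-0.7,  0.7,  0.6),
    "sad":     (-0.6, -0.5, -0.4),
}


def label_emotion(valence: float, arousal: float, dominance: float) -> str:
    """Return the label of the nearest prototype in the 3-D VAD space."""
    point = (valence, arousal, dominance)
    return min(EMOTION_PROTOTYPES,
               key=lambda name: math.dist(point, EMOTION_PROTOTYPES[name]))


print(label_emotion(0.5, 0.9, 0.4))  # -> "excited"
```

Because the underlying signal is continuous, finer-grained labels can be added simply by placing more prototypes in the space.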

Ramchurn was excited by the opportunity the technology presented. According to Ramchurn, whilst more interactive films are being produced, the prevalent interactive methods filmmakers use have changed little since the 1960s. Ramchurn wanted to create an adaptive movie using ubiquitous technologies and audiences’ real-time reactions to the scene unfolding before them. In so doing, he would explore the processes involved in that creation whilst considering the privacy and ethical implications of the act.

Ramchurn’s previous work had used brain-computer interfaces (BCI) to analyse signals from the brain, allowing audiences to create a non-conscious edit of the film. That approach required highly specialised equipment. The plan now was to exploit technology more readily available to audiences and cinematographers.

To get the ball rolling, as part of Horizon’s Co-Production Campaign project ‘Adaptive Interactive Movies’, Ramchurn drew on previous research work looking at both the type of emotion filmmakers try to elicit and the way in which viewers display emotion in reaction to film clips.

To make sense of the Arousal and Valence data which would be read from people’s faces, the filmmakers asked several questions. What do filmmakers intend for the audience to feel? What can the computer system observe? And how could that data be used to make a binary choice as to which scene will play next?

To discover the intended emotion, BlueSkeye AI’s Head of Research and Development, Mani Telamekala, wrote an open-source mark-up program, which the filmmakers used to manually mark up what they thought the intended viewer emotion was (Valence and Arousal) for specific scenes.

Ramchurn’s AIM project team chose clips, between one and three minutes long, from 16 films representative of the genre they planned to make. They asked participants to watch those clips while their faces were recorded. These recordings were put through BlueSkeye’s B-Social software and Valence and Arousal data extracted. This gave an indication of how Valence and Arousal data vary between people and clips, and within the clips themselves. It was apparent, says Ramchurn, that audiences tend not to emote extensively. However, BlueSkeye’s software was sensitive enough to pick up small variations and emotional cues. On this basis the team went on to develop algorithms which used BlueSkeye’s Arousal and Valence data to contribute real-time edit decisions, causing their film to reconfigure itself in response to audience reaction.
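To give a feel for that analysis step, the sketch below summarises a per-frame Valence and Arousal trace for one viewer watching one clip, making the “small variations” quantifiable. The input format, scaling and choice of statistics are assumptions for illustration, not the project’s published pipeline.

```python
from statistics import mean, stdev


def clip_summary(valence: list[float], arousal: list[float]) -> dict:
    """Summarise a per-frame Valence/Arousal trace for one viewer and one clip.

    The traces are assumed to be sampled at a fixed frame rate and scaled to
    [-1, 1]; mean and standard deviation are an illustrative choice of summary.
    """
    return {
        "valence_mean": mean(valence),
        "valence_sd":   stdev(valence),   # how much the signal moves within the clip
        "arousal_mean": mean(arousal),
        "arousal_sd":   stdev(arousal),
    }


# Example: a viewer who barely emotes still produces small but measurable variation.
print(clip_summary(valence=[0.02, 0.05, 0.04, 0.08],
                   arousal=[0.10, 0.12, 0.18, 0.15]))
```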

The result was AlbinoMosquito Productions’ “Before We Disappear,” an artistic response to human-caused climate collapse, starring actress and writer Jessica M Milford. The film previewed as part of a two-day workshop on interactive film organised by the LEADD:NG programme at the Broadway Cinema in Nottingham on 24 February 2023.

“Before We Disappear” is set in the year 2042, when the UK has become a tropical region. Floods, fires, storms and civil unrest are common. It follows two main characters who have been involved in a series of high-level assassinations using autonomous weapons built by climate activists. The activists designed a selection algorithm that chose the positions and people to be targeted and killed. This system assessed the unit of human suffering caused by any one person’s actions, based on how responsible they were for causing environmental destruction and how much they personally benefited from it. The system calculated the minimum number of people to be targeted to save the most lives. It was an example of the trolley problem, but on a global scale. And so, the system acted as a moral judge, jury and executioner.

The first scene of the film acts as a baseline against which the next scene is measured, and so on. Then, depending on the audience response, the narrative becomes one of around 500 possible edits. As a consequence, the film follows a non-linear narrative, offering the audience different endings and emotional journeys.
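A minimal sketch of that baseline-relative branching is shown below: the response to the first scene sets the reference point, and each subsequent binary choice compares the latest response against it. The decision rule, scene names and the use of Arousal alone are hypothetical simplifications, not the film’s actual branching logic.

```python
def choose_next_scene(baseline_arousal: float,
                      scene_arousal: float,
                      branch_up: str,
                      branch_down: str) -> str:
    """Pick one of two candidate scenes by comparing the audience's response
    to the scene just watched against the baseline set by the first scene.

    Branching on whether engagement rose relative to the baseline is an
    illustrative rule chosen for this sketch.
    """
    engagement_rose = scene_arousal > baseline_arousal
    return branch_up if engagement_rose else branch_down


# Example: Arousal rose relative to the opening scene, so the more intense branch plays.
print(choose_next_scene(baseline_arousal=0.2, scene_arousal=0.35,
                        branch_up="scene_3a.mp4", branch_down="scene_3b.mp4"))
```

Repeating a binary choice of this kind at each cut point is what multiplies the small number of branch decisions into the several hundred possible edits described above.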

It’s easy to see how, and by whom, this approach could be misused, Ramchurn says, to manipulate audience response towards a desired end, such as a specific purchase or action. The research aimed to provoke discussion about how users’ emotion data can be controlled by them and informed consent given for responsible use.

Here the project benefited from working with BlueSkeye. The company has incorporated privacy by design into its technology from its inception. Data collection and storage are minimised wherever practical, and all data is processed on people’s own devices, without using the cloud. This gives users control over who they share their data with and when, always with end-to-end encryption.

The film is set to be released as an interactive app, incorporating an awareness of potential abuse of the user’s data, and safeguarding any personal data on the device used to watch it.

According to Ramchurn,

“Adaptive films offer an alternative to traditional ‘choose-your-own-adventure’ storytelling. Thanks to Horizon and BlueSkeye we were able to create a story that changed based on the audiences’ unconscious responses rather than intentional interaction, which might distract them from the film. This means they can enjoy a more personalised and more immersive experience of the film, whilst hopefully provoking discussion about the risks of and responses to emotion AI.”

Founding CEO of BlueSkeye, Professor Michel Valstar said,

“Over the years we have worked extensively with Horizon Digital Economy Research on several research projects supporting the creation of advanced digital technologies. Helping create a film which used our technology to capture, non-intrusively, the audience’s emotional response to the film as it progressed, and provide the tools to create a truly interactive cinematic experience, was a fascinating challenge. The fact that the film was also used to explore the very real ethical issues attendant on this technology was a bonus. All told, we were very grateful to Horizon for bringing our two organisations together and creating something truly innovative.”

Media coverage of ‘Before We Disappear’:

BWD ITV Central 220223
