Adaptive Interactive Movies (AIM)

The ethical challenges and cultural opportunities of Affective Media

Interactive films are being produced in growing numbers, yet the prevailing interaction methods have changed little since the 1960s. Previous work has used BCI technology to produce adaptive interactive cinema [Pike 2016, Ramchurn 2018, 2019, 2020]. While this produced initially promising results, the limited public uptake of BCI technology remains a barrier to reaching a wide audience.

This project will investigate the processes involved in creating adaptive movies that collect and use audiences’ real-time personal data, while remaining mindful of the privacy and ethical implications of this scenario.

We propose using the more ubiquitous front-facing cameras on laptops and personal devices to derive physiological affective data, by applying computer vision and machine learning techniques. This raises a number of ethical, trust, network, and machine learning challenges that must first be uncovered, addressed, and incorporated into the design of any adaptive system before it can be evaluated.
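As a concrete illustration of the kind of adaptation such a system might perform, the sketch below maps a short window of camera-derived arousal estimates onto branching scene choices. All names (`AffectSample`, `choose_branch`, the scene labels, and the arousal thresholds) are hypothetical assumptions for illustration, not part of any project codebase.

```python
from dataclasses import dataclass

@dataclass
class AffectSample:
    """One frame of camera-derived affect data (hypothetical: arousal in [0, 1])."""
    arousal: float

# Candidate scene branches keyed by affect band; an adaptive film would
# cut to the branch matching the audience's estimated current state.
BRANCHES = {"calm": "scene_a", "neutral": "scene_b", "tense": "scene_c"}

def choose_branch(samples: list[AffectSample]) -> str:
    """Pick the next scene from a window of recent affect samples."""
    if not samples:
        return BRANCHES["neutral"]  # no data yet: fall back to a default branch
    mean_arousal = sum(s.arousal for s in samples) / len(samples)
    if mean_arousal < 0.33:
        return BRANCHES["calm"]
    if mean_arousal < 0.66:
        return BRANCHES["neutral"]
    return BRANCHES["tense"]
```

In practice the affect estimates would come from a computer-vision pipeline over the camera feed, and the branching logic would be authored by the filmmakers rather than fixed thresholds; the sketch only shows the shape of the real-time data-to-narrative mapping.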

Partners will shape the research and showcase practice and recommendations to their professional networks. Alignment with the large-scale LEADD:NG project will provide further opportunities to share research findings with businesses.

A prototype will be designed for exhibition in the wild, suitable for both in-person and online creative XR festivals.

Project Team: Richard Ramchurn, Sarah Martindale, Mani Kumar Tellamekala, Sameh Zakhary, Neelima Sailaja, Jo Parkes, Max Wilson, Aleksandra Landowska, Giovanni Schiazza

Partners: AlbinoMosquito & Kino Industries

Start date: June (18 months)

This project sits within Horizon’s Co-Production Campaign.

Co-Production Campaign blogsite