Horizon Blog

Thinking about Responsible Research and Innovation and Artificial Intelligence in Context: Reflecting on Participatory Design and Ethnography as an approach to Ethical Enquiry.

In this blog Alan Chamberlain, Senior Research Fellow at the Mixed Reality Lab, University of Nottingham, discusses his research interests, how he came to be interested in responsible research and innovation (RRI), and his membership of the Responsible Digital Futures forum. Alan is a Principal Investigator on the Trustworthy Autonomous Systems (TAS) RRI Project with Chris Greenhalgh, Professor of Computer Science, University of Nottingham.

Reflecting on RRI

I believe that it is important to think about the nature of enquiry in an ethical way, placing it within the context of Responsible Research and Innovation (RRI). In developing this, and the approaches we might associate with RRI, it is important to reflect on existing work and on the ways the emerging fields of RRI and Artificial Intelligence (AI) intertwine. This enables us to encompass more intersubjective, existential, and participatory approaches, allowing researchers and other stakeholders to discuss, appreciate and understand what lies behind the positions and practices that inform one's stance when reasoning about what responsibility is, in context. Such an approach can evidence how this reflects our social view of the world, providing a (socio) collective response to issues that impact on a setting in numerous ways, owing to the nature of technologies and the ways they are implemented and practically used.

Projects Led

I co-lead the TAS RRI Project, where I focus predominantly on RRI in a creative context, and I am the Nottingham lead on two other TAS projects: Co-Design of Context-Aware Trustworthy Audio Capture and TAS Benchmarks Library and Critical Review. Each of these has its own particular response to RRI, owing to the in-situ nature of trust and responsibility.

Co-Design of Context-Aware Trustworthy Audio Capture

TAS Benchmarks Library and Critical Review

I am also involved in Experiencing the Future Mundane, a project led by Professor Andy Crabtree, University of Nottingham (working with Lancaster University and BBC R&D), which links to the Horizon Digital Economy Research Adaptive Podcasts project. We are taking The Caravan, our physical vision of a possible AI future, out to the public to see what they can tell us about how they view that future, and how this might lead us to develop more inclusive understandings of how people perceive it. Our next stop will be in Salford, Manchester at an event with Lancaster University, the BBC and the University of Salford, followed by a possible visit to the Scottish AI Summit. We then move to Cardiff and Bristol, taking a UK-wide approach to engagement! This is really important, as we want to try to talk to people who don't normally engage with future tech, its design and deployment.

Experiencing the Future Mundane Caravan – Manchester MozFest 2021 (Blog post)

As part of the TAS project I have been leading the Artist in Residence programme (with Professor Steve Benford) and working with the project's Cultural Ambassadors – Blast Theory, who are employing a strategy to widen participation and engage with a range of people. Working with the TAS RRI Project, they have set up an Audience Advisory Board to make this happen. Recent research suggests that the number of working-class people employed in the creative industries is dwindling, and that access to culture is a further issue, as highlighted in this article and in Getting in and getting on and Social Mobility and ‘Openness’ in Creative Occupations since the 1970s.

Cultural responsibility, in research and organisationally, is something that needs to be thought about, and may indeed be something that organisations have already considered. It can manifest itself in a range of ways: as a civic mission, as legislation (the socio-economic duty on public bodies in Wales), or as policies (held by a funder, a grant-holding organisation, a project, or an individual researcher). For example, UK Research and Innovation (UKRI) policies show the open, accountable nature of the organisation. When taking a responsible approach to engagement in research (in-situ) and dealing with issues that impact on people, whether about class, age, or another aspect of one's identity, addressing responsibility in these terms is key to the perceived validity of the research, its trustworthiness, its adoption, and respect for those involved in it. We have written about these approaches to research in our edited collection – Into the Wild: Beyond the Design Research Lab.

Developing Approaches to RRI & AI

Developing approaches that, for me personally, have emerged from our previous work focusing on long-term engagement studies 'in the wild' can start to offer a series of framings for other researchers to use: an initial point from which to develop and consider how they might approach engaging in RRI in their own work. Doing this can lead us to examine the nature of RRI in and across a research project, from its conception and planning, through how the research is carried out, to the possible outputs of the project, products, publications, outcomes beyond the lifecycle of the research, and exit strategies. Within this context I believe that ethnography, genuine engagement and participatory approaches to design are central to both developing and appreciating RRI in context.

I hope that by providing a few examples of the things my colleagues and I are working on, and relating these back to RRI, I have prompted people to think about some of the issues that researchers face when starting to look at RRI on a personal level, and about how to take an ethical research stance. The more we start to think about the ethics of research, the more we appreciate and understand that responsibility is a multifaceted, sometimes very personal area, which inevitably impacts upon our actions and research practices and the way we engage with organisations, communities, and individuals. By appreciating and thinking about responsibility we can be more inclusive, fair, accountable, and truthful as researchers.

Note: Autonomous systems and Artificial Intelligence (AI) are the framing for much of our current research work. This is a complex space, providing researchers with sets of technologies that have highly context-specific requirements, that can exist in domains involving diverse stakeholders, and whose possible interactions can be adaptive, autonomous and able to learn (in-situ and a priori, without context).

In terms of responsibility, it is worth thinking about the changing and evolving research landscape. One thing that I think researchers will be interested in is UKRI's commitment to supporting skills and talent, and in particular the Concordat to Support the Career Development of Researchers, which has also been adopted by the University of Nottingham.

Please get in touch with Alan if you would like to discuss any of this further.