ELSI and Autonomous Systems seem to go together: making Autonomous Systems trustworthy appears to demand ethical consideration, thinking about regulation and law-making, and attention to the societal impacts of both.
In our first two workshops with TAS-S, I worked with my colleagues in Research Strand 3 (RS3) to design activities that could help facilitate discussions about ELSI in relation to Security. Getting people talking is never an easy task, and it has been made even harder by the impact of the pandemic on our lives and the move online.
With this in mind, we tried to keep things simple but engaging. Participants in workshops rarely want a bunch of terminology dumped on them, and of course they come with their own terminological repertoire. Designing engagements across disciplines – in this case, Sociology, Legal Studies, and Computer Science – really helps to refine activities and make them accessible.
In our first stakeholder workshop (29th March 2021), I designed some simple activities for a brief session, focussing on speculative approaches to AS Security. Rather than dealing with the here and now, which is already overflowing with complexities, speculative approaches give us time to imagine, feel, and tell stories about new and uncertain territories of experience. What emerged from the short time we had with the TAS-S stakeholders is that we are barely scratching the surface of all the interrelated challenges and opportunities presented by Autonomous Systems.
This laid the groundwork for a second, longer workshop with the National Cyber Security Centre, where I helped design more in-depth speculative activities around ELSI and Autonomous Drones. The response and ensuing conversations were really positive, and once again touched on the need for collaboration in making Autonomous Systems secure, ethical, legal, and trustworthy.
More workshops and engagements are being planned in RS3 right now, and we’re excited to share these with you very soon.
For more information about RS3, please visit our Research page.