There is a growing expectation that technologies and algorithms should align with, and reflect, commonly held public and societal values. But to make this happen, society needs to be kept ‘in the loop’. Do we need an implicit social contract between those developing and designing the tech and those who may be affected by it?
‘Society in the loop’ has been defined by MIT Professor Iyad Rahwan as ‘embedding the judgement of society, as a whole, in the algorithmic governance of societal outcomes.’ In doing so, Rahwan identifies the need for wider societal engagement in instances where machine learning and AI directly impact our society and our economy. But how can we best enable society to influence and shape decisions about technology and innovation?
Beyond the normative case for ‘society in the loop’, there is considerable evidence that enabling society to influence and shape decisions about technology and innovation leads to better outcomes. It can safeguard against technocratic groupthink that fails to anticipate or mitigate social harm. For instance, research by Professor Scott E Page at the University of Michigan has found that diverse groups outperform groups of intelligent (but likeminded) individuals in solving complex problems: diverse groups were more likely to break through challenges than groups of smart people who tended to think similarly. This approach can also help ensure that technology itself is more in tune with public and societal values, and can strengthen the ‘cultural competence’ of technology in a world where diverse peoples are increasingly interconnected – often by technology.
The notion of ‘society in the loop’ was the focus of a conference hosted by Civil Society Futures, Doteveryone and the Ada Lovelace Institute. The event convened over 30 social scientists, technologists and policymakers as speakers for a day of challenge, collaboration and conversation. They were joined by just over 100 delegates.
The Society in the Loop conference sparked a range of valuable perspectives and thoughts, but three cross-cutting themes emerged across all discussions:
Insight 1: Building a shared language and interdisciplinary dialogue around the key issues affecting society is essential
When it comes to data and AI, lazy use of language risks excluding groups within society from the debate, as well as presenting barriers to creating a shared language and consensus.
- The public debate often confines dialogue about the impact of technology to the language of ‘ethics’ rather than the broader language of societal rights and responsibility.
- There is a ‘fundamental disconnect’ between the language used in technology, and the language used by the social sector (Annika Small OBE).
- The public debate is often at risk of adopting the narrative of the ‘passive consumer’ rather than the narrative of ‘active citizens’, or the language of ‘community’, ‘solidarity’ and ‘society’ (Professor Evelyn Ruppert).
What counts as ‘meaningful’ consent may be interpreted differently by different organisations and individuals. Similarly, the concept of ‘explainability’ is itself subject to considerable debate about where the threshold for ‘adequate’ explainability lies. Other notions used freely in discussion and debate about AI, data and ethics – such as bias, fairness and trust – are similarly slippery.
The use of technical – and often inaccessible – language can inadvertently exclude those who deserve and need to be at the table. If we want to keep society ‘in the loop’, we need to use language purposefully to bridge the gaps between diverse communities.
Insight 2: Start with the vision of the society we wish to create, and then ensure technology reflects that vision
Hetan Shah, Chief Executive of the Royal Statistical Society, argued that a helpful starting point is to articulate the kind of society you wish to create. Concern that technology is creating a less equal and less inclusive society perhaps says more about our failure to pursue a shared vision for that society than about the technology itself.
There were a number of suggestions for how a clearer vision for society might be articulated, including:
- Building upon the norms and rules that we have already agreed matter (such as the rule of law, an example given by the Law Society’s Sophia Adams Bhatti), so that they remain effective in a changing world.
- Balancing expertise and technical insight about the best way forward with the democratic demands of wider society.
- Balancing effective experimentation, innovation and creativity that has potential to do enormous good with ‘safety-first’ priorities that prevent public ‘tech-lash’ against innovation.
In particular, it is widely acknowledged that people want to be part of the conversation and are not as apathetic about these issues as is generally perceived. However, publics often struggle to know how best to engage with or influence the system, and to understand exactly how the issues directly affect them. As part of articulating a vision for society, it therefore seems important to have an agenda for more effective inclusion and literacy.
Insight 3: There is a need for greater accountability (‘lions under the internet’), but also greater clarity about what this means
The term ‘accountability’ was one of the most widely used and discussed throughout the conference. In our law & justice discussion, Ravi Naik, award-winning human rights lawyer, issued a call for ‘lions under the internet’, analogous to the way judges have been ‘lions under the throne’. Implicit in this notion is the sense that power is held and exercised in a way that demands checks and balances, good governance and responsiveness on the part of those making decisions (including companies, technologists and policymakers) towards both citizens and wider society.
Suggestions for strengthening accountability included:
- A rethink of good governance by companies themselves to ensure that they are effectively representing societal interests, for example by having Chief Ethical Officers on their company boards (Kriti Sharma, Vice President of Sage Group).
- The importance of technologists engaging with ethics and societal questions throughout their professional training.
- Requiring 20% of technologists to gain a better understanding of charitable and social sector needs and constraints by sitting on charity boards, or otherwise contributing towards social purpose outcomes – for example, volunteering for organisations such as DataKind UK (Annika Small OBE, CAST).
- Ensuring that technology itself is responsive to wider societal needs through its design and development (a number of strong examples were provided by Projects by If – such as the new Data Rights Finder).
- Ensuring collective ownership of data – for instance, Sophia Varlow of the Commons Platform argued for the creation of new collectively owned data platforms.
It is important to note that technology is not developing neutrally and we must ensure that broad accountability does not reinforce existing institutional or social inequalities. For example, Karen Croxson at the Financial Conduct Authority warned that by making consumers more predictable, data may risk making them more exploitable.
The emergence of the term ‘society in the loop’ reflects the growing need for an implicit social contract between those developing and designing tech and those affected by it.
However, this is no small ask: we do not have a homogeneous public (rather, we have publics), a homogeneous civil society, or even homogeneous sets of power holders. One example explored during the conference was the difficulty of navigating the tension between the open access and open data movements and consent-based models.
Ensuring that society is ‘in the loop’ requires more than weighing societal and public perspectives equally with ‘expert’ input. It also means understanding and addressing tensions between the wills of diverse groups and peoples – which requires consensus building and/or the negotiation of economic and societal trade-offs.