Salary: £34,433 – £35,614 per annum FTE (dependent upon experience)
Hours: 35 hours per week (part time working also considered)
Contract: To start early February 2021, fixed term until 31 July 2021
Closing date: 12:00 pm midday, Thursday 14 January 2021
The Ada Lovelace Institute is hiring a researcher for a project evaluating methods of algorithm auditing and impact assessment. This position provides a unique opportunity to develop cutting-edge research on the UK Government’s approach to algorithm auditing and impact assessments. Working in tandem with our Senior Researcher on Algorithm Accountability and a technical adviser, the researcher will work on a six-month project with access to a public agency in the UK that is seeking to implement an auditing and impact assessment process. This position offers an unprecedented opportunity to turn principles of assessment into practice.
Ideal candidates will have:
- Experience developing and implementing audits or assessments of algorithms and datasets in either the public or private sector
- Qualitative interviewing and analysis skills
- Excellent project management skills
- Expertise in technical methods for assessing algorithms.
You will be responsible for:
- Creating a literature review for auditing and impact assessment methods that builds on our Examining the Black Box project
- Co-leading qualitative interviews with experts and public agency officials
- Developing recommendations for auditing and assessing models and datasets
- Co-authoring a report on the research via the Ada Lovelace Institute website.
You may have a background working in the tech industry, or researching and coordinating for an academic organisation, research institute or community charity. You may have a university degree, or have gained experience from an apprenticeship, trainee programme, bootcamp or on the job. You are curious and passionate about the issues which arise at the intersection of technology and society, and are committed to bringing an interdisciplinary and intersectional lens to understanding them. You’ll be comfortable taking the initiative, working independently and to short deadlines at times. You’ll enjoy working in a team environment, be willing to jump into projects and keen to explore areas of policy, technology and practice that you don’t already understand. You’ll appreciate the importance of exceptionally high standards of rigour in research, but also want to think creatively about communicating and influencing in novel ways.
About the Ada Lovelace Institute
The Ada Lovelace Institute is an independent research institute and deliberative body funded and incubated by the Nuffield Foundation in 2018. Our mission is to ensure data and artificial intelligence work for people and society. We do this by building evidence and fostering rigorous debate on how data and AI affect people and society. We recognise the power asymmetries that exist in ethical and legal debates around the development of data-driven technologies and seek to level those asymmetries by convening diverse voices and creating a shared understanding of the ethical issues arising from data and AI. Finally, we seek to define and inform good practice in the design and deployment of AI technologies.
After little more than a year of operation, the Institute has emerged as a leading independent voice on the ethical and societal impacts of data and AI. We have built relationships in the public, private and civil society sectors in the UK and internationally. Some of our most impactful work to date includes our rapid evidence review on contact tracing apps, Exit Through the App Store?, and our public attitudes survey on facial recognition, Beyond Face Value. Our research broadly focuses on four pillars:
- Data for the Public Good: evaluating and understanding the social value of data; promoting data stewardship; advocating for data rights and regulation, and addressing issues of data injustice.
- Algorithm Accountability: understanding how algorithmic systems are changing the delivery of public services; exploring mechanisms for auditing and assessing algorithmic systems; developing new methods to ensure algorithms are transparent and accountable to those affected by them.
- Justice and Equity: understanding how data and AI interact with identity, race and ethnicity; identifying mechanisms for preventing the inequitable and discriminatory impact of data-driven technologies in domains such as health, education and criminal justice.
- COVID & Health: exploring AI, data, and healthcare, particularly digital and technical responses to the COVID pandemic such as contact tracing apps and vaccine certification schemes.
Our research takes an interconnected approach to issues such as power, social justice, distributional impact and climate change (read our strategy to find out more), and our team have a wide range of expertise that cuts across policy, technology, academia, industry, law and human rights. We value diversity in backgrounds, skills, perspectives and life experiences. Because we are part of the Nuffield Foundation, we are a small team with the practical support of an established organisation that cares for its employees.
The Ada Lovelace Institute values diversity, equity and inclusion in our workplace. Our work and culture are strengthened by our differences in experience, national origin, religion, culture, sexual orientation, and other backgrounds. We welcome applications from people of colour, women, the LGBTQIA community, people with disabilities, and people who identify with other traditionally underrepresented and minoritised backgrounds.
We aim to be a collaborative, welcoming and informal place to work. Before COVID-19 the team worked flexibly, with some working from home regularly or on an ad hoc basis. We now operate fully remotely, using collaborative working tools such as Microsoft Teams with regular video calls. We are currently a 12-person team and expect to return to some in-person working in 2021 (and will have a shiny new office in Farringdon in early 2021), but we are open to staff working remotely for the foreseeable future, including in UK locations outside of London.
How to apply
To apply, please click through to our online portal where you can submit your CV and covering letter explaining how your experience matches the requirements detailed in the person specification.
The closing date for applications is 12:00 pm midday GMT on 14 January 2021, with interviews taking place via video in the week of 18 January 2021.
If you are from a background that is underrepresented in the sector (for example you are from a community of colour, did not go to university or had free school meals as a child), and you would like to discuss how your experience may be transferable to this role, you can book time with one of our team who will be pleased to have a chat with you. Please note that this person will not be involved in the recruitment process. You can request this by emailing firstname.lastname@example.org (and we will not ask you to disclose your background).