Algorithm Audit: Why, What, and How?
Seeking to increase the social awareness of citizens, institutions, and corporations with regard to the risks presented by the uncritical use of algorithms in decision-making, this book explains the rationale and methods of algorithm audit. Interdisciplinary in approach, it provides a systematic overview of the subject, supplying readers with clear definitions and practical tools for the audit of algorithms, while also taking account of the political, business and vocational obstacles to the development of this new field. As such, it constitutes an essential resource for students and researchers across the social sciences and humanities, as well as for professionals and policy makers concerned about the social consequences of algorithmic decision-making.
Biagio Aragona is an Associate Professor of Sociology at the University of Naples Federico II, Italy.
Routledge Advances in Research Methods
Researching Ageing
Methodological Challenges and their Empirical Background
Edited by Maria Łuszczyńska
Diagramming the Social
Relational Method in Research
Russell Dudley-Smith and Natasha Whiteman
Participatory Case Study Work
Approaches, Authenticity and Application in Ageing Studies
Edited by Sion Williams and John Keady
Social Causation and Biographical Research
Philosophical, Theoretical and Methodological Arguments
Georgios Tsiolis and Michalis Christodoulou
Beyond Disciplinarity
Historical Evolutions of Research Epistemology
Catherine Hayes, John Fulton and Andrew Livingstone with Claire Todd, Stephen Capper and Peter Smith
Concept Analysis in Nursing
A New Approach
John Paley
Algorithm Audit: Why, What, and How?
Biagio Aragona
For more information about this series, please visit: www.routledge.com/Routledge-Advances-in-Research-Methods/book-series/RARM
First published 2022
by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
and by Routledge
605 Third Avenue, New York, NY 10158
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2022 Biagio Aragona
The right of Biagio Aragona to be identified as author of this work has been asserted by him in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.
British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library
Library of Congress Cataloging-in-Publication Data
A catalog record has been requested for this book
ISBN: 9780367530914 (hbk)
ISBN: 9780367530921 (pbk)
ISBN: 9781003080381 (ebk)
DOI: 10.4324/9781003080381
Typeset in Times NR MT Pro
by KnowledgeWorks Global Ltd.
To Giulia, Guglielmo, Siria and Sveva, for all the times I have answered "Excuse me, but I am working."
Contents
1 Why
2 What
3 How
4 Rights, politics, and education
First and foremost, many thanks to Enrica Amaturo, for her guidance throughout my academic career and for her detailed read-through and critique of the entire manuscript. I would also like to acknowledge Susan Halford for inspiring me to explore the applications of social research methods to algorithms and other data-intensive technologies. Cristiano Felaco was an essential sounding board and a source of valuable materials.
The research conducted in writing this book was supported in part by the innovative projects fund of the University of Naples Federico II, my home, the place where I grew up as both a scholar and a person.
DOI: 10.4324/9781003080381-101
It is not easy to introduce this book. There are so many examples of algorithms impacting our lives and societies that it is difficult to make a choice. Then, while I was writing, an algorithm designed for the governance of education in England produced inequality by lowering the A-level results of nearly 36% of students who could not sit their exams due to the coronavirus pandemic.
The algorithm was employed by the Office of Qualifications and Examinations Regulation (OFQUAL), a non-ministerial government department that regulates qualifications, exams, and tests in England, to combat grade inflation and moderate the teacher-predicted grades for A-level and General Certificate of Secondary Education (GCSE) qualifications in 2020. The use of the algorithm was justified because examinations were cancelled as part of the response to the COVID-19 pandemic. Of course, as always happens when trying to automate administrative processes, the OFQUAL algorithm was developed with the best intentions: ensuring that qualification standards were maintained and that the distribution of grades followed that of previous years.
For A-level students, their school had already included a predicted grade as part of the application reference for the Universities and Colleges Admissions Service (UCAS), which operates the application process for British universities. The UCAS application had to be submitted by 15 January 2020 (15 October 2019 for Oxford, Cambridge and medicine), and the predicted grades had been shared with the students.
According to his UCAS application, Mithusan Thiagarajah, an A-level student at a high school in Surbiton, was offered a place to study medicine at Caius College, Cambridge. No one in Mithusan's school had ever gone to Cambridge before; his success made not only him but also his school proud.
The A-level grades were announced on 13 August 2020. Nearly 36% were lower than the predicted grade. Unfortunately, Mithusan, who had been expected by his teachers to achieve four A*s, also had his results downgraded, to one A* and three As, not enough for Caius College. The college withdrew its offer.
What had happened?
OFQUAL's algorithm is based on the record of each examination centre in the subject being assessed. The grades of students at small schools, or of those taking minority subjects, were computed differently from those of students at large schools. When more than 15 students in an examination centre were taking the subject, the teacher-predicted grade (called the centre assessed grade, or CAG) was standardised against the centre's results over the previous three years. When fewer than 15 students in the centre were taking the subject, CAGs were used without comparison with the centre's historical data. As a result, students at small schools, or those taking minority subjects, received grades higher than teacher predictions, while the opposite happened to students at large schools. The formula for standardising grades thus effectively reproduced the disparities that exist in the British education system. Small schools and minority subjects are typical of private schools, while large state schools have open-access policies and have historically educated large numbers of minority ethnic students and vulnerable students, who often have a lower grade distribution. Unfortunately, Mithusan and many other top students with his socio-demographic characteristics did not fit the time series of their examination centre.
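The two-track logic described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not OFQUAL's actual statistical model: the function name, the rank-to-distribution mapping, and the example data are all hypothetical, chosen only to show how a cohort-size threshold combined with a centre's historical distribution can cap the grades of high-achieving students at historically weaker centres.

```python
# Hypothetical, simplified sketch of the standardisation logic described
# in the text; NOT OFQUAL's actual formula.

def standardise(cags, historical_grades, threshold=15):
    """Return final grades for one examination centre in one subject.

    cags: teacher-predicted centre assessed grades, best first.
    historical_grades: grades this centre awarded in previous years,
        best first; for large cohorts their distribution replaces the CAGs.
    """
    if len(cags) <= threshold:
        # Small cohort: teacher predictions are used as-is.
        return list(cags)
    # Large cohort: rank students by CAG and map each rank position onto
    # the centre's historical grade distribution, ignoring the CAG values.
    n, h = len(cags), len(historical_grades)
    return [historical_grades[min(i * h // n, h - 1)] for i in range(n)]

# A strong cohort at a centre whose past results were weaker:
cags = ["A*"] * 8 + ["A"] * 12                  # 20 predictions, best first
history = ["A"] * 5 + ["B"] * 10 + ["C"] * 15   # 30 past grades, best first
final = standardise(cags, history)
# The best historical grade at this centre is an "A", so no student here
# can receive an A*, however strong the teacher predictions were.
```

Under this toy rule, the same A*-predicted student keeps the A* at a small centre (the CAGs pass through untouched) but loses it at a large one, which mirrors the asymmetry between small private-school cohorts and large state-school cohorts described above.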