Unit for the Ethics of Technology

Head of Unit: Dr Tanya de Villiers-Botha

tdev@sun.ac.za

Dr Tanya de Villiers-Botha is a Senior Lecturer in the Department of Philosophy and Head of the Unit for the Ethics of Technology at the Centre for Applied Ethics. She holds an MA and a DPhil in Philosophy from Stellenbosch University. Her research specialisations include the philosophy of cognitive science, meta-ethics, and applied ethics, with a particular focus on digital ethics.

Full profile

Fabio Tollon | Research Associate

ftollon@ed.ac.uk

Fabio is a philosopher of technology with interests in the ethics of AI, moral responsibility, and free will. He is a postdoctoral researcher in the BRAID (Bridging Responsible AI Divides) programme at the University of Edinburgh. He is also a research fellow at the Unit for the Ethics of Technology at Stellenbosch University and a research associate at the Centre for Artificial Intelligence Research (CAIR) at the University of Pretoria. Fabio’s work has been published in journals such as Ethics and Information Technology, European Journal of Analytic Philosophy, and AI & Society.

Website

Niël Conradie | Research Associate

niel.conradie@humtec.rwth-aachen.de

Dr Niël Conradie is a postdoctoral researcher in the Applied Ethics Group of the Department of Society, Technology, and Human Factors at RWTH Aachen University, Germany. He earned his PhD in Philosophy, focusing on the intersection of responsibility and action theory, at the University of St Andrews, Scotland, in 2017. Before this, he received his MA in Philosophy and his BA at Stellenbosch University, South Africa. His research interests encompass topics in action theory, responsibility, the ethics of technology, and AI ethics.

During his time in Aachen, he has participated in the BMBF-funded InviDas project, investigating the ethical dimensions of wearable technologies, with a particular focus on the emerging class of smart wearables and their impact on the decisional autonomy and digital sovereignty of users. The project is interdisciplinary, involving partners from several technical fields as well as Garmin Ltd as a corporate partner. The outcomes of this work have been presented at several conferences and published in relevant journals. With the InviDas project coming to a close in 2023, he will soon be joining the Bridging AI project.

Beyond these projects, he focuses on questions of collective responsibility and how they relate to AI and other emergent technologies. Many such technologies are increasingly autonomous, which raises concerns about the appropriate tracking of responsibility when their use results in morally significant harms. Nowhere is this more pertinent than in the context of military applications, where these techno-responsibility gaps have received substantial discussion. Niël seeks to mitigate these concerns by arguing for the inclusion of collective responsibility in our toolkit of responsibility practices for dealing with highly autonomous technologies embedded within structured decision-making procedures.

Publications

  • Tigard, D. W., Conradie, N. H. and Nagel, S. K. 2019. Socially responsive technologies: toward a co-developmental path. AI & Society. 35: 885–893.
  • Conradie, N. H. 2020. Enhancing Responsibility and Responsible Enhancement: moral bioenhancement and the actual-sequence account of moral responsibility. Ramon Llull Journal of Applied Ethics. 11: 331–350.
  • Butting, A., Conradie, N. H., Croll, J., Fehler, M., Gruber, C., Herrmann, D., Mertens, A., Michael, J., Nitsch, V., Nagel, S., Pütz, S., Rumpe, B., Schauermann, E., Schöning, J., Stellmacher, C., and Theis, S. 2020. Souveräne digitalrechtliche Entscheidungsfindung hinsichtlich der Datenpreisgabe bei der Nutzung von Wearables. Forum Privatheit 2020.
  • Conradie, N., Theis, S., Croll, J., Gruber, C., and Nagel, S. 2022. The impact of smart wearables on the decisional autonomy of vulnerable persons. Künstliche Intelligenz, Demokratie und Privatheit. Nomos Verlagsgesellschaft mbH & Co. KG.
  • Conradie, N. H. and Nagel, S. K. 2022. Digital sovereignty and smart wearables: Three moral calculi for the distribution of legitimate control over the digital. Journal of Responsible Technology. 12. https://doi.org/10.1016/j.jrt.2022.100053.

Lize Alberts | Research Associate

lize.alberts@keble.ox.ac.uk

Lize is a doctoral candidate in Computer Science in the Human Centred Computing group at the University of Oxford. Her work is funded by a graduate scholarship from the Responsible Technology Institute. Her research centres on improving how automated systems treat people in interactions, particularly systems that behave as social actors. Her work explores what it means to treat users in ways that are respectful and supportive of their autonomy, as well as identifying dark patterns in interfaces that use social cues to manipulate user behaviour.

Lize has an MA by Thesis degree in Philosophy from Stellenbosch University, focusing on embodied cognition and computational linguistics. Her master’s thesis combines a theoretical review of work in cognitive science and the philosophy of mind and language with a review of current technical approaches in natural language processing (NLP) to distinguish between different senses of language ‘understanding’. She also holds a BA Honours degree in Philosophy from Stellenbosch University, and her dissertation was published in a peer-reviewed journal. Before that, she graduated top of her class with a BA in Humanities from North-West University, Potchefstroom, obtaining double the required credits with four majors (Philosophy, English, Social Anthropology, and History of Art).

Lize is a founding member of the Responsible Technology Institute’s Student Network based at the University of Oxford, and is currently working as a student researcher at Google, London.