Examining popular arguments against AI existential risk: a philosophical analysis

Files

Access status: Embargo until 2026-05-25, s10676-025-09881-y.pdf (905.18 KB)

Publication date

2026-03

Authors

Swoboda, Torben
Uuk, Risto
Lauwaert, Lode
Rebera, Andrew P.
Oimann, Ann Katrien
Chomanski, Bartlomiej
Prunkl, Carina

Document Type

Article

License

Taverne

Abstract

Concerns about artificial intelligence (AI) and its potential existential risks have garnered significant attention, with figures like Geoffrey Hinton and Demis Hassabis advocating for robust safeguards against catastrophic outcomes. Prominent scholars, such as Nick Bostrom and Max Tegmark, have further advanced the discourse by exploring the long-term impacts of superintelligent AI. However, this existential risk narrative faces criticism, particularly in popular media, where critics such as Timnit Gebru, Melanie Mitchell, and Nick Clegg argue, among other things, that it distracts from pressing current issues. Despite extensive media coverage, skepticism toward the existential risk discourse has received limited rigorous treatment in the academic literature. Addressing this imbalance, this paper reconstructs and evaluates three common arguments against the existential risk perspective: the Distraction Argument, the Argument from Human Frailty, and the Checkpoints for Intervention Argument. In doing so, the paper aims to provide a foundation for more balanced academic discourse and further research on AI.

Keywords

AI risk skepticism, AI Safety, Artificial general intelligence, Existential risk, Superintelligence, Computer Science Applications, Library and Information Sciences

Citation

Swoboda, T, Uuk, R, Lauwaert, L, Rebera, A P, Oimann, A K, Chomanski, B & Prunkl, C 2026, 'Examining popular arguments against AI existential risk: a philosophical analysis', Ethics and Information Technology, vol. 28, no. 1, 7. https://doi.org/10.1007/s10676-025-09881-y