
Why we must keep humans at the heart of AI in warfare

29.07.2025

The past decade has seen rapid growth in the global development and use of artificial intelligence (AI), and the UK Ministry of Defence (MoD) has been no exception.

Much of the international legal discussion around regulating military AI, particularly Lethal Autonomous Weapons Systems (LAWS), which can select and attack targets without further human intervention, has focused on the importance of human judgement and control in military decision-making.

Since 2016, the Convention on Certain Conventional Weapons' Group of Governmental Experts on LAWS has been holding discussions, but International Humanitarian Law (IHL) still lacks any specific, binding regulations relating to AI. As noted by International Committee of the Red Cross (ICRC) President Mirjana Spoljaric, AI in war is “no longer an issue for tomorrow”, but rather “an urgent humanitarian priority today”, requiring the immediate “negotiation of new legally binding international rules”. Accordingly, United Nations Secretary-General António Guterres recommended, in his 2023 New Agenda for Peace, that “a legally binding instrument” to prohibit and/or regulate AI weapons be concluded by 2026.

The ICRC has stressed that responsibility in warfare must remain with humans. “Human control must be maintained,” it argues, and limits on autonomy urgently established “to ensure compliance with international law and to satisfy ethical concerns”.

In 2022, the MoD itself echoed this sentiment. It stated that only human soldiers “can make instinctive decisions on the ground in a conflict zone; improvise on rescue missions during natural disasters; or offer empathy and sympathy.”

© Herald Scotland