The Autonomous Weapons Newsletter

Unpacking submissions to the UN SG's report (we've picked our favourites), Pope Francis wants an AWS ban, what we're reading, recent headlines, and more.

This is Anna Hehir, FLI’s Lead on Military AI, and Maggie Munro, Communications Strategist, here with the third edition of The Autonomous Weapons Newsletter. We’re excited to bring you the news on autonomous weapons systems (AWS) at a pivotal moment, as the world comes to terms with whether algorithms should make life-and-death decisions (spoiler alert: most people are terrified).

With this monthly publication, we’re keeping our audience - primarily policymakers, journalists, and diplomats - up to date on the autonomous weapons space, covering policymaking efforts, weapons systems technology, and more.

That being said, if you have no idea what we’re talking about, check out our starter guide on the topic.

If you’ve enjoyed reading this, please be sure to subscribe and share as widely as possible.

Unpacking submissions to the UN Secretary-General’s report

States, civil society, and stakeholders interested in AWS regulation have been busy doing their homework, writing submissions to be compiled into a report produced by the UN Secretary-General. The report itself will be published in late July (some pessimists think it’ll be mid-August), but that hasn’t stopped us from taking a look. Let’s dive into some of our favourites:

Serbia has come out of the woodwork in support of “strict restrictions and rigorous monitoring” of AWS. Not holding back, Serbia described the use of AWS as “absolutely senseless”, warning that in cases of software and hardware anomalies, or hacking, “death and destruction are the only guarantee”.

In line with the ICRC and the UN SG, Fiji supports a prohibition on autonomous weapons that would target people. Fiji’s submission highlights the environmental and climate impact of autonomous weapons, pointing to the carbon cost of training algorithms. Fiji also supports the UN General Assembly as the ideal forum for negotiations, declaring that it “is time to step outside of the CCW to one that can aim higher, move faster, and be more inclusive of countries that are not party to the CCW as well as of international organisations and civil society”.

Ireland has become the 120th state to support a legally-binding instrument. Ireland maintains a “human-centred approach”, joins calls for multilateral rules, and supports parallel initiatives to the CCW that “facilitate an inclusive, global approach”.

Egypt strongly supports a legally-binding instrument and believes that the General Assembly would be the best forum to take forward the discussions that have taken place in the CCW Group of Governmental Experts. Egypt finds it “regrettable that progress [in the GGE] remains quite minimal, and that no tangible results have been reached yet.” Egypt underscored its position based on the unethical nature of delegating the decision to take a human life to a machine: “Even if an algorithm can be programmed to determine what is ‘legal’ under IHL, it can never be programmed to determine what is ‘ethical’.”

Sri Lanka supports a legally-binding instrument and deems non-binding voluntary measures insufficient to address the serious legal, ethical, and security challenges posed by AWS. Sri Lanka also believes that “the issue of a specific technical definition on AWS should not stand in the way of commencement of negotiations” for a treaty. Sri Lanka highlighted the security impacts of AWS such as asymmetric warfare, escalation, proliferation, mass destruction, and destabilisation.

Endorsements of the Austrian Chair’s Summary on AWS

If you haven’t already, check out the summary from the first ever global conference on AWS, the recent Vienna Conference on Autonomous Weapons Systems: Humanity at the Crossroads.

If you represent a state and would like to associate with the summary, or if you have any questions, please reach out to [email protected] or any Austrian Permanent Mission or Embassy. If you’re not representing a state but want to help, do share the one-pager attached below with your Foreign Ministry and encourage them to associate.

Attachment: Call to associate with Vienna chair's summary (PDF, 69.91 KB)

Austria is updating the online list on a rolling basis throughout the summer, so get your endorsement in by August 15!

What We’re Reading

We’ve thoroughly enjoyed a recent blog post from SIPRI’s Laura Bruun for the ICRC Humanitarian Law & Policy Blog, titled ‘Reinventing the wheel? Three lessons that the AWS debate can learn from existing arms control agreements’.

TL;DR? Here are the three lessons:

  1. A prohibition does not need to be grounded in a clearly defined class of weapons.

  2. Restrictions can be used to clarify what IHL requires in the specific context of AWS.

  3. If there is a will (and a need), two-tiered instruments can be grounded in concerns beyond IHL.

Overheard This Month

  • “A single pixel is enough to confuse a bomber with a dog, a civilian with a combatant” - Zachary Kallenborn to DefenceNews.

  • “When you think about proportionality, you can’t programme it into code” - Megha Arora, Palantir’s Responsible AI Product Lead, at the Paris Conference on AI and Digital Ethics.

  • “AI weapons to take away the risk… but not the guilt” - Declassified Australia on Matilda Byrne’s new article covering Australia’s resistance to AWS regulation.

  • “You cannot export that decision-making to a machine or a computer or a software” - ICRC President Mirjana Spoljaric in EL PAÍS, on the “loss of human control and accountability” in employing autonomous weapons systems.

AWS in the News

Pontiff’s appeal to leaders: Addressing world leaders at the G7 summit for the first time ever, Pope Francis explicitly called for a ban on the use of autonomous weapons, stating “no machine should ever choose to take the life of a human being.”

Ex-Google CEO creates secret AI drone company: Forbes reports that former Google CEO Eric Schmidt has been “stealthily” creating White Stork, a military startup to develop mass-producible autonomous attack drones for use by Ukrainian forces. Schmidt has undertaken a recruiting blitz among Big Tech companies, government agencies, universities, and AI hackathons in the US, making White Stork an open secret in the drone community.

West Africa turns to regulation efforts: Sierra Leone Ambassador Lansana Gberie spoke with the Inter Press Service to explain why unregulated autonomous weapons pose risks to Africa, in particular West Africa.

“We are a vulnerable region… These dynamics benefit weapons manufacturers and draw important resources away from peacebuilding and sustainable development”.

Sierra Leone Ambassador Lansana Gberie

Dogs of war: Imagine the viral Boston Dynamics robot dog, but with a machine gun mounted on its back, and you have something close to China’s new Unitree-built “robot battle dog”. How much autonomy the robot dogs have in identifying and shooting targets wasn’t clarified when they were displayed at the recent China-Cambodia joint military drills, but with a Chinese soldier stating they can replace “our [human] members to conduct reconnaissance and identify [the] enemy and strike the target,” we’ll be watching to see how they are deployed.

Image: Sina Weibo, via Cybernews

Replicator reveal: More details have been released about the US military’s Replicator Program, which aims to introduce thousands of autonomous systems by fall 2025. DefenseScoop reports on the first tranche of technology selected, including loitering munitions, counter-drone capabilities, unmanned surface vehicles, and interceptors.

Contact Us

For tips or feedback, don’t hesitate to reach out to us at [email protected].