New UN talks; Google deletes its history; civilian drone casualties in Ukraine; and more.

This is Anna Hehir, FLI’s Head of Military AI, and Maggie Munro, Communications Strategist, here with the eighth edition of The Autonomous Weapons Newsletter. We’re excited to bring you the news on autonomous weapons systems (AWS) at a pivotal moment, as the world grapples with whether algorithms should make life-and-death decisions (spoiler alert: most people are terrified).

With this publication, we keep our audience - primarily policymakers, journalists, and diplomats - up to date on the autonomous weapons space, covering policymaking efforts, weapons systems technology, and more.

If you have no idea what we’re talking about, check out autonomousweapons.org for our starter guide on the topic.

If you’ve enjoyed reading this, please be sure to subscribe and share as widely as possible.

📣 New UN talks on autonomous weapons in New York 📣

The beginning of 2025 has been jaw-dropping for anyone with a stake in multilateralism, the rules-based order, and the system of relationships, carefully built over decades, that underpins global stability and security. Whilst it’s difficult to fully analyse and make sense of such fast-changing dynamics, what is clear is that we’re in a period of history where the decisions of states and international institutions matter more than ever.

Which brings us to one particular ray of hope. Whilst tech companies make the most of the current regulatory vacuum around military AI, states continue to move closer to a treaty on autonomous weapons systems. On 12-13 May, states will meet at the UN in New York to continue urgent discussions aimed at opening treaty negotiations.

You may recall our coverage in the last edition of the UN resolution put forth by Austria at the UN General Assembly, which passed with 166 votes in favour. The resolution mandates the UN to convene a two-day meeting of states, observers, civil society, and experts in New York to discuss the urgent need for legally binding rules over algorithms designed to kill.

If you would like to attend the UN talks in New York, see below:

  • As a representative of a state, please RSVP to the UN ODA Science, Technology and International Security Unit.

  • If you’re a member of an international organisation, civil society, the scientific community or industry, keep an eye on this page for further information closer to the date.

  • If you’re a journalist or from the media, guidelines for media participation will also be published shortly by UN ODA.

We’ll be sure to see you there! 🍎

Google deletes its AI principles history

Google quietly erased the “applications we will not pursue” section from its public AI principles page earlier this month, notably reversing its pledge not to build AI for weaponry or surveillance.

The company had previously faced employee protests for providing the IDF and U.S. military with some cloud and AI services, which Google claimed were not used to harm humans. This new development, however, is Google’s first explicit move away from its promise not to build AI used to harm, or even kill, humans.

Adding to our disappointment is the fact that, of the six prominent AI companies reviewed by an independent panel for our 2024 AI safety scorecard, Google DeepMind was - at the time - the only one still holding out against developing AI weapons.

What We’re Reading

📚 Austrian Amb. Alexander Kmentt published “Geopolitics and the Regulation of Autonomous Weapons Systems” in Arms Control Today, on the urgent need for international cooperation to restrict autonomous weapons systems, the geopolitical obstacles hindering such cooperation, and what the road ahead looks like.

📚 In the Washington Post, a look at the movement, led by defense tech companies such as Anduril, to push AI-enabled weapons as a new way for the U.S. military to wage war.

Overheard This Month

  • “We’ve got to have rules about these drones, these lethal autonomous weapons.” - Tom Fletcher, UN Under-Secretary-General for Humanitarian Affairs and former Foreign Affairs Advisor to British Prime Ministers Gordon Brown, Tony Blair, and David Cameron.

  • “This [Trump] administration cares about weapon systems and business systems and not ‘technologies’… We're not going to be investing in ‘artificial intelligence’ because I don’t know what that means. We're going to invest in autonomous killer robots.” - An anonymous senior U.S. defense official on the Trump Administration’s Pentagon pivot from research to “usable arms and gear”.

  • “This is the first confirmation we have gotten that commercial AI models are directly being used in warfare.” - Heidy Khlaaf, chief AI scientist of the AI Now Institute, on American tech companies supplying AI models to the IDF.

  • “I've been on record saying I don't think we should have lethal autonomous weapons, but some countries are building them… That's just a reality.” - Google DeepMind CEO Demis Hassabis during an interview about Google’s aforementioned policy shift on military AI.

AWS in the News

Ukrainian flight and might: Short-range drones caused more civilian casualties in Ukraine than any other weapon last month, according to the UN Human Rights Monitoring Mission in Ukraine (HRMMU) - in some areas accounting for 70% of civilian casualties. These first-person-view (FPV) drones have increasingly targeted civilians in vehicles, on public transport, and even in private spaces, despite real-time cameras that should, in principle, allow their operators to distinguish between military and civilian targets.

A survivor of an attack in Mykolaiv on 9 January told HRMMU how a small drone circled above his head before diving directly at him while he was working in his home’s garden.

Austria takes the arms (control) race gold: We’re thrilled that Austrian Foreign Minister Alexander Schallenberg and the Austrian Foreign Ministry were voted the Arms Control Association’s 2024 “Arms Control Persons of the Year”! Schallenberg and his Ministry have spearheaded the international push to restrict autonomous weapons, including convening the 2024 Vienna Conference and advancing the aforementioned UNGA resolution, a step towards a ban.

Informing policy, protecting humanity: This is the last week for scientists to apply to the UN’s open call to serve on the new Independent Scientific Panel on Nuclear War Effects; the deadline is this Saturday, 1 March. This is a distinguished position, and successful applicants will have the opportunity to influence the course of arms control over the next decade or more.

Contact Us

For tips or feedback, don’t hesitate to reach out to us at [email protected].