When you start getting interested in nuclear disarmament, you quickly realise that the issue cannot be viewed in isolation from other disarmament campaigns. The humanitarian movement for nuclear disarmament draws its strategy and values directly from past disarmament campaigns, and it takes its place alongside other weapons campaigns currently working to curb the brute excesses of war. For me, after nuclear weapons came landmines and cluster munitions (it sounds odd writing that!), since those successful campaigns inform the work of the International Campaign to Abolish Nuclear Weapons. Next it made sense to learn about chemical and biological weapons, as these influenced the Nuclear Weapons Convention and are the other two weapons of mass destruction. The Arms Trade Treaty is another recent success; although not a disarmament treaty, the ATT regulates the trade in conventional weapons (e.g. weapons should not be sold to those likely to commit human rights abuses). Then there’s the killer robots campaign: a (scarily real) contemporary campaign which aims to pre-emptively prohibit the use and stymie the development of fully autonomous weapons. This last campaign featured in a panel discussion held yesterday at Saint Andrew’s on the Terrace.
The panel of Marnie Lloyd (New Zealand Red Cross), Thomas Nash (Article 36 United Kingdom) and Edwina Hughes (Peace Movement Aotearoa New Zealand) tackled the big stuff: nuclear weapons, explosive devices and killer robots. The subject matter was vast and the questions afterwards further expanded the ambit of the conversation. Although this proved challenging at times, I nevertheless found the discussion illuminating and its ethical questions profoundly unsettling.
So what’s the deal with explosive weapons?
“Explosive weapons” is a broad term which can refer to weapons like bombs, mortars, landmines and improvised explosive devices – basically anything that can be detonated in or over towns and cities. Marnie explained that the ICRC (International Committee of the Red Cross) is concerned that these weapons disproportionately impact civilians. Explosions can destroy schools and hospitals and take the lives of thousands of innocent men, women and children. Destroying civilian infrastructure has ripple effects across a nation’s population. For instance, if a power plant is destroyed, hospitals cannot function, water supplies cannot be maintained, and civilians may become internally displaced. This creates a humanitarian nightmare. It is possible to use explosive weapons in compliance with international humanitarian law; the problem for the ICRC is not the weapons themselves, but the way these weapons are being used. (Reminiscent of the "it's not the drinking..." ads!)
The ICRC has taken a number of affirmative actions over the last five years, such as releasing written statements and resolutions calling on states to protect civilians, and facilitating a meeting of state representatives and independent experts to share information on the topic.
According to Thomas, people around the world have been strangely ambivalent towards the massive loss of civilian life in countries like Syria, Yemen and the Sudan in recent years. In his words, there has been a “moral outrage gap.” It should not be accepted as inevitable that whole towns will be bombarded to satisfy a purported military aim.
The International Network on Explosive Weapons (INEW) which Thomas coordinates has the support of organisations such as Save the Children, Oxfam, Human Rights Watch and Women’s International League for Peace and Freedom. The UN Secretary-General has voiced his support of the goal. INEW wants states to acknowledge the problem, adopt more transparent measures in discussing existing national policies and endorse a political statement laying out a reasonable position on the issue.
What about killer robots?
This campaign is in its infancy in disarmament terms. Edwina told us that the Campaign to Stop Killer Robots was established in 2013, following the precedent of the campaign to ban blinding lasers (these weapons were banned when the technology was still being developed, as they were considered too ghastly to ever use). Killer robots, or fully autonomous weapons, are discussed in forums such as the Convention on Certain Conventional Weapons (CCW: for once the acronym is preferable to the clunky full name!) and the UN Human Rights Council. For Edwina, the idea of robots taking over command from humans in conflict undermines the whole idea of international humanitarian law, since IHL presupposes that humans make decisions about the fate of other human beings before them. Edwina also expressed her discontent with the New Zealand government’s stance on both killer robots (Mary Wareham, a New Zealander working for Human Rights Watch, echoes this sentiment) and nuclear weapons. Edwina is concerned that the government has stepped away from the helm to take a backseat on these two disarmament issues.
Now to digress… When people talk about killer robots, they invariably make reference to sci-fi films like Terminator. I was a booklover long before I became a cinephile, and for years my films of choice were foreign-language films and documentaries… so my affinity with sci-fi blockbusters is sadly lacking. But if it counts, I have recently seen Her and Ex Machina, and what I took away from these films (unfortunately) was that if you want to make big bucks with an AI film, you should centre your plot around the eternal themes of sex and violence. In both movies, hapless male characters fall in love with robots / operating systems, or at least entertain fancies of copulating with them. (What is this rumour about women being emotional and irrational, by the way?) In Her, operating system Samantha fools Theodore into believing she only cares for him, whilst systematically manipulating hundreds of other humans. In Ex Machina, it is only a matter of time before robot Ava (complete with seductive husky voice) learns of her inevitable dismantlement and plans to escape, on the way seeking revenge on programmer Nathan, who had locked her up. However much these films may promise to expand our wary minds with surprising themes, they invariably fall back on the old winners of sex and violence.
Which might arguably be fine for entertainment value if it weren’t for the realisation that this is how things actually play out. (Let’s leave the ‘does art mimic, or create, reality?’ discussion for another time…) What’s leading the worldwide robotics and AI movement? The noble desire to replace humans with machines in dangerous occupations like underground mining? Um maybe, but probably not. My research method here is totally lacking in rigour, but do a quick search of warbots and sexbots on the internet and you’ll quickly get the idea that, sad as it might be, sex and violence are paving the way yet again in this technological development. (See “Sex, Bombs and Burgers” in my blog archives for more on this subject.)
So when we think about the future of robotics, it is saddening yet hardly surprising that a large part of human energy is being directed towards creating robots (warbots) for the battlefield. The idea is that fully autonomous machines might take over the controls from humans; that is to say, no human intervention would be needed once the warbots were programmed to undertake a euphemistically-termed ‘mission.’ We already see examples of partly autonomous weapons being used to wage war; drones used by the military in Pakistan and directed by people in the US are a prime example. The development of warbots is intrinsically linked to the development of other, nominally peaceful uses of technology. It is no coincidence that warbots are developing at the same time as driverless cars. Is it mere coincidence that drone targeting ability is improving at the same time as Facebook gets smarter at recognising our friends in tagged photos?
The interconnection of technology and warfare presents peace campaigners and supporters of international humanitarian law with a challenge. The sheer monolithic size (in terms of political power, finances and reach) of the techno-weapons industry makes it enormously difficult to defeat (to employ a military term); however, given that everyday technologies may be linked to the weapons industry, there may be creative ways to engage the general public on these issues. If you could (hypothetically) prove a connection between a certain brand of car and investment in autonomous weapons, why not create a campaign urging the public to never buy that particular model and to take to social media to shame the manufacturers?
Finally… a note on the nuclear weapons front!