“If we’re looking for that one terminator to show up at our door, we’re maybe looking in the wrong place,” says Matt Mahmoudi, Amnesty International artificial intelligence researcher. “What we’re actually needing to keep an eye out for are these more mundane ways in which these technologies are starting to play a role in our everyday lives.”
Laura Nolan, a software engineer and former Google employee now with the International Committee for Robot Arms Control, agrees. “These kinds of weapons, they’re very intimately bound up in surveillance technologies,” she says of lethal autonomous weapons systems, or LAWS.
Beyond surveillance, Nolan warns: “Taking the logic of what we’re doing in warfare or in our society, and we start encoding it in algorithms and processes … can lead to things spinning out of control.”
But Mahmoudi says there is hope for banning autonomous weapons, citing existing protections against the use of chemical and biological weapons. “It’s never too late, but we have to put human beings and not data points ahead of the agenda.”
On UpFront, Marc Lamont Hill discusses the risks behind autonomous weapons with the International Committee for Robot Arms Control’s Laura Nolan and Amnesty International’s Matt Mahmoudi.