London, United Kingdom – “First we pick up that there’s a planned protest,” said Barry Millet, an information security manager at Mitie, as he pointed at a computer screen streaming a small protest.
“Our guys will analyse it, look at the various feeds, and work out how many people are attending. What sort of actions have these groups taken in the past? Are they passive? Are they more direct action? Are they likely to try and gain entry into the building?” he continued, demonstrating his company’s monitoring software from a trade booth at London’s 2021 International Security Expo (ISE), at the Olympia exhibition centre.
At the recent event, everything from mobile phone trackers and electrified, motion-sensitive border fences to facial recognition software and nano-drones was on show for government and corporate buyers.
Mitie holds a string of United Kingdom contracts, providing security tools at immigrant detention centres and supermarkets.
At the expo, it presented a sprawling open-source data aggregator, using information on the protest posted on Twitter, Facebook and Instagram as well as real-time video from vloggers live-streaming the rally.
As UK policing becomes increasingly reliant on big data, several new bills set to expand police powers will boost real-time and retroactive data monitoring, as well as the use of facial recognition software and artificial intelligence (AI).
The most notable of these changes to UK policing is the Police, Crime, Sentencing and Courts Bill, which reached its third reading in the House of Commons on July 5.
Monumental in its scope, the bill would allow authorities to criminalise selected marches and protests, with breaches of the law carrying a maximum sentence of 10 years.
It also seeks to expand stop-and-search powers over people who have previously committed violent offences.
‘Safest place in the world online’
The government claims its Draft Online Safety Bill will make Britain “the safest place in the world online” and is likely to rely heavily on AI-driven content moderation.
Martyn’s Law, drafted in response to the 2017 Manchester Arena bombing that killed 22 people, will require venues to carry out enhanced security monitoring and strategic planning.
Much of the technology on show at the ISE is currently used by law enforcement agencies in the UK and around the world.
With Britain’s increased police budget for 2021-22, which will see funding for counterterrorism total 914 million pounds ($1.2bn), several of the security and policing solutions on display may begin to shift from being a last resort to a first response.
“One hour of CCTV video from a single camera’s footage can take a police officer between one and four hours to watch,” said Fariba Hozhabrafkan, chief commercial officer of SeeQuestor, a facial recognition software company whose AI British police have contracted to use in missing-person, rape and murder cases.
“An incident could have six or seven hundred hours of CCTV footage,” she said.
SeeQuestor’s software allows real-time and recorded security camera footage to be uploaded into the programme where it performs facial recognition on people of interest, linking them to existing or live footage – as well as police and Home Office databases.
When it has found a potential match, the AI presents investigating officers with a probability score.
It can also perform targeted searches based on gender, race and clothing – and is trained to recognise inanimate objects such as guns, backpacks or abandoned suitcases in airports and train stations.
Despite government and company assurances, however, there are deep concerns over privacy regarding surveillance technology. Critics question its ethical uses, the storage of data, and the potential effect and biases of AI.
The London Metropolitan Police came under fire in early 2020 for the unannounced use of facial recognition, manufactured by Japanese tech firm NEC, outside the busy Oxford Circus station in central London. Human rights groups challenged the legality of its use.
Although the technology is claimed to make streets safer, researchers from the University of Essex, commissioned by the Met, found that the software not only failed 80 percent of the time but also exhibited extreme racial bias.
Nevertheless, in August, a four-year contract was approved between the Met and NEC to implement retroactive facial recognition software across London.
“The market for surveillance is hurtling forward,” said Silkie Carlo, director of the privacy and civil liberties campaign group Big Brother Watch. “Different [UK] authorities keep buying and deploying [devices] with a high level of secrecy and often with a very shaky or non-existent legal basis.
“When live facial recognition is in use, you see people on watchlists for no good reason including activists, people with mental health problems, people who’ve not committed any crime whatsoever.”
But according to Sabrina Wagner of Videmo, a German facial recognition software company, despite such concerns, “[AI] usage is going to increase as the data is simply increasing”.
“Police these days don’t have terabytes – they have petabytes – of data that is being stored as evidence.”
Videmo’s software – which can identify partially obscured faces – was used by German police after the 2017 G20 protests in Hamburg and led to 15 arrests. Authorities then created a portal for people to upload videos they had shot of protesters and combined this information with video camera and open-source footage.
“I think it was 60 years’ worth of video material that they had to go through,” said Wagner.
Videmo uses the area around the eyes to create a match, making it of particular interest to law enforcement tracking protesters in balaclavas.
Videmo is now working on gait recognition technology. According to Wagner, early-stage trials indicate that it is possible to accurately identify an individual merely through their gait and limb proportions.
Carlo, the campaigner, warned: “In America, there have been people who’ve gone to jail over facial recognition because there’s the temptation for people to think it’s like DNA. People think that’s it’s highly, highly accurate. It’s not. And people can be wrongly flagged quite easily. Again, the problem is there’s no legal framework for that.”
The IMSI catcher, a piece of hardware with a history of use at protests, was also on show at the London event.
Known in the US as a Stingray, it intercepts, tracks and monitors mobile phones en masse, by tricking them into treating it as a phone mast.
Stingrays gained visibility in the United States for their use against Black Lives Matter protesters and are rumoured to have been used by the FBI for years.
They were also recently discovered in Mexico City, where giant fake antennas illegally surveilled phones.
British company Revector has managed to scale down the size of IMSI catchers and attach them to a drone that can be flown for eight hours at a time.
“The first application of IMSI catchers was in prisons, to find illicit mobile phones. They were the size of a truck. Then they were the size of a car. And we’ve had this [drone] two or three years … And we’re looking at getting things smaller than that,” said a spokesman for the company.
The company’s drone-mounted IMSI catcher has found applications in mountain rescue and the emergency services in remote areas without phone signal, as well as in anti-wildlife-trafficking operations, locating poachers.
The company also works with law enforcement.
The deployment of IMSI catchers is currently forbidden in the UK without the consent of the home secretary.
Privacy International submitted a freedom of information request regarding the use of IMSI catchers by UK law enforcement, which was rejected.
Tom Drew, head of counterterrorism at the data science company Faculty, said AI should be implemented with great care.
Faculty works with the UK Home Office, Serious Fraud Office and Passport Office and is the official data science partner of the National Crime Agency.
“Something that is incredibly important when thinking about public data sets is privacy,” he said.