Data collection and use exclude the most vulnerable: Researcher

A British research body says ‘huge power imbalances’ exist in terms of data governance and usage.

Technologies such as AI are like climate change, says Kind of the Ada Lovelace Institute: those with the least say are often the most affected [File: Reuters]

From tackling diseases to improving transport, technologies such as artificial intelligence have unleashed a wave of opportunities, but society’s most vulnerable citizens are being excluded from them, according to a leading human rights researcher.

The “digitisation of information” affects every sector of society, but not everyone benefits equally, said Carly Kind, head of the Ada Lovelace Institute, a UK-based research body named after the 19th-century mathematician and computer pioneer.

“We see huge power imbalances in terms of who governs, hoards and uses data, and in what ways,” said Kind, who is leading a European Commission-funded project on data governance and privacy regulation.

“Tech monopolies have put a lot of power in the hands of a few companies, while those most impacted by job automation or digital benefits systems are often the most disadvantaged parts of society.”

Tech giants, once seen as engines of economic growth and a source of innovation, have come under fire on both sides of the Atlantic for allegedly misusing their power and for failing to protect their users’ privacy.

Kind cited the criminal justice system as one area where marginalised communities have been discriminated against by the use of facial recognition and algorithms.

Computers have become adept at identifying people in recent years, unlocking a myriad of applications for facial recognition, but critics have voiced concerns that the technology is still prone to errors.

“Research shows that policing technologies predicting where crime might occur can be informed by biased datasets,” said Kind, a speaker at the Thomson Reuters Foundation’s annual Trust Conference on Thursday.

“That could lead them to wrongly identify black and coloured people as more likely to offend, and create over-policing in certain areas.”

She likened new technologies to climate change, saying that those with the least say are often the most affected.

The best way to ensure technology is a “force for good” and is used ethically, Kind said, is to involve the public in debating such issues.

“Companies need to be more transparent, and communicate to people how their data is being used,” said Kind, who took up her post in July.

“But the biggest onus is on the state: one of the lessons from Brexit is that people feel disconnected from policymaking.”

Kind called on governments to take a “precautionary approach” to adopting new technologies.

“It’s not about banning things or strictly regulating what we don’t understand, but through best practice taking a slow and steady approach and figuring out what will bring everyone along on the ride,” she said.

Source: Reuters