Facebook has announced that it disabled its topic recommendation feature after the software mistook Black men for “primates” in a video on the social network.
A Facebook spokesperson called it a “clearly unacceptable error” and said the recommendation software involved was taken offline.
“We apologize to anyone who may have seen these offensive recommendations,” Facebook said in response to an AFP inquiry.
“We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again.”
Facial recognition software has been blasted by civil rights advocates who point out problems with accuracy, particularly when it comes to people who are not white.
Facebook users in recent days who watched a British tabloid video featuring Black men were shown an auto-generated prompt asking if they would like to “keep seeing videos about Primates,” according to the New York Times.
The June 2020 video in question, posted by the Daily Mail, is titled, “White man calls cops on black men at marina.”
While humans are among the many species in the primate family, the video had nothing to do with monkeys, chimpanzees or gorillas.
A screen capture of the recommendation was shared on Twitter by former Facebook content design manager Darci Groves.
“This ‘keep seeing’ prompt is unacceptable,” Groves tweeted, aiming the message at former colleagues at Facebook.
“This is egregious.”
The social media giant founded by Mark Zuckerberg has faced several controversies in recent years.
In 2020, hundreds of advertisers signed on to the Stop Hate for Profit campaign, organised by social justice groups including the Anti-Defamation League (ADL) and Free Press, to pressure Facebook to take concrete steps to block hate speech and misinformation in the wake of the death of George Floyd, a Black man who died in police custody.
In a 2019 Al Jazeera piece, David A Love, a Philadelphia-based freelance journalist and media studies professor, also alleged that Zuckerberg’s company is willingly “enabling hate groups, white nationalists and far-right extremists”.