A whistleblower accused Facebook of putting profit before people as she told Congress its products harm the mental health of some young users, stoke divisions and weaken democracy.
During a Senate Commerce subcommittee hearing, Frances Haugen called for transparency about how Facebook entices users to keep scrolling, creating ample opportunity for advertisers to reach them.
“As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable,” said Haugen, a former product manager on Facebook’s civic misinformation team. She left the nearly $1 trillion company with tens of thousands of confidential documents.
“The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed,” Haugen said.
Her testimony came a day after Facebook and two of its main services, Instagram and the messaging app WhatsApp, suffered an hours-long global outage, and after weeks of mounting pressure on the social media company to explain its policies for young users.
Haugen went public in an interview with CBS on October 3 and revealed she was the one who provided documents used in a Wall Street Journal investigation and a Senate hearing on Instagram’s alleged harm.
The WSJ stories showed the company contributed to increased polarisation online through changes to its content algorithm, failed to take steps to reduce vaccine hesitancy, and was aware that Instagram harmed the mental health of teenage girls.
Hours after Haugen’s testimony, chief executive Mark Zuckerberg defended his company in a public Facebook post, saying the accusations were at odds with Facebook’s goals.
“The argument that we deliberately push content that makes people angry for profit is deeply illogical,” he wrote. “We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed.”
A Facebook spokesperson, Kevin McAlister, earlier said in an email to the Reuters news agency that the company sees protecting its community as more important than maximising profits. He also said it was inaccurate to say that leaked internal research demonstrated that Instagram was “toxic” for teenage girls.
That echoed testimony Facebook’s head of global security, Antigone Davis, delivered before the same Senate committee last week. “We care deeply about the safety and security of the people on our platform,” Davis said at that time.
“We take the issue very seriously … We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”
In an era of deep political divisions in Washington, DC, both Republican and Democratic lawmakers agreed on the need for big changes.
In an opening statement, Democratic Senator Richard Blumenthal, who chairs the subcommittee holding the hearing, said Facebook knew that its products were addictive, like cigarettes.
“Tech now faces that Big Tobacco jaw-dropping moment of truth,” Blumenthal said.
He called for Zuckerberg to come before the committee, and for the Securities and Exchange Commission and Federal Trade Commission to investigate the company.
“Our children are the ones who are victims. Teens today looking in the mirror feel doubt and insecurity. Mark Zuckerberg ought to be looking at himself in the mirror,” Blumenthal said.
Senator Marsha Blackburn, the top Republican on the subcommittee, said that Facebook turned a blind eye to children below age 13 on its sites. “It is clear that Facebook prioritises profit over the wellbeing of children and all users,” Blackburn said.
Al Jazeera’s Shihab Rattansi, reporting from Capitol Hill, said regulating content on Facebook and other social media platforms could prove tricky for Congress, given the protections afforded to free speech under the First Amendment.
“The question becomes, ‘Well what criteria will be used and who will have oversight of that’,” Rattansi said.
Still, Jason Kint, CEO of the Digital Content Next trade organisation, said Tuesday’s hearing was significant. “What’s different about this moment is we have evidence coming from inside the building,” he told Al Jazeera.
“What this hearing provides is that evidence that they knew and that there was actual empirical data supporting all of these downstream harms of the way the platform works.”