Facebook whistleblower Frances Haugen told U.K. lawmakers on Monday that the company’s refusal to take responsibility for its services or incentivize employees to speak up about problematic behavior created the toxic situation that exists today.
“There is an unwillingness at Facebook to acknowledge that they are responsible to anyone,” Haugen said on Monday, testifying at a U.K. Parliament hearing on new legislation aimed at tackling harmful content online.
Haugen appeared in public for a second time since revealing herself as the source behind the numerous internal documents that sparked the Wall Street Journal’s series, “The Facebook Files.” Haugen testified before the U.S. Congress earlier this month, and has since started sharing her trove of documents with numerous news outlets.
Facebook leadership is focused on growth, and has created a culture that focuses on the positive aspects of the company’s services at the expense of dealing with the problems they cause, Haugen said on Monday.
“Facebook is overwhelmingly full of conscientious, kind, empathetic people,” she said. “Good people who are embedded in systems with bad incentives are led to bad actions. There is a real pattern of people who are willing to look the other way are promoted more than people who raise alarms.”
Haugen said Facebook hasn’t put in place ways for employees to point out issues that management should consider addressing or that researchers could examine.
“Facebook has shown over and over again not just that they don’t want to release that data but even when they do release that data they often mislead people,” she said.
It’s an attitude lodged in Facebook’s start-up culture and one that won’t change until the company is forced through regulation to alter its incentives, Haugen said.
“When they see a conflict of interest between profits and people, they keep choosing profits,” Haugen said.
A Facebook spokesperson said in an emailed statement that the company agrees on the need for regulation “so that businesses like ours aren’t making these decisions on our own.” The representative also reiterated Facebook’s disputes from recent stories and said the company has spent “$13 billion and hired 40,000 people to do one job: keep people safe on our apps.”
Is Facebook evil?
John Nicolson, a member of Parliament, asked Haugen whether Facebook was simply evil.
“What your evidence has shown to us is that Facebook is failing to prevent harm to children, it’s failing to prevent the spread of disinformation, it’s failing to prevent hate speech,” Nicolson said. “It does have the power to deal with these issues, it’s just choosing not to, which makes me wonder whether Facebook is just fundamentally evil. Is Facebook evil?”
Haugen said the word she would use is “negligence.”
“I do believe there is a pattern of inadequacy, that Facebook is unwilling to acknowledge its own power,” she said. “They believe in flatness, and they won’t accept the consequences of their actions. So I think that is negligence and it is ignorance, but I can’t see into their hearts.”