T’Nae Parker, a Black activist who’s been on Facebook since college, almost always has the app open on her phone, spending hours each day helping her community in South Carolina – that is, unless Facebook cuts off her access.
By her count, Parker has had her posts removed and her account locked in a punishment commonly referred to as “Facebook jail” 27 times for speaking out against racism or the complicity of white people in anti-Blackness.
“Shutting us out is basically saying ‘Shut up,’” Parker, 36, says.
Now, a Facebook advertising boycott is giving voice to years of complaints that the social media giant disproportionately stifles Black users while failing to protect them from harassment.
Launched by civil rights groups three weeks ago, the Stop Hate for Profit campaign, which has a broader aim of curbing hate speech, white supremacy and misinformation on Facebook, has struck a national nerve. Hundreds of companies, including top brands Unilever, Ford and Pfizer, have pulled their advertising from Facebook for July.
On Tuesday, Facebook CEO Mark Zuckerberg and executives Sheryl Sandberg and Chris Cox are scheduled to meet with the civil rights groups behind the boycott, among them the online racial justice group Color of Change, the Leadership Conference on Civil & Human Rights and the NAACP Legal Defense Fund.
Hate, harassment of Black users worsens
Forged in the aftermath of George Floyd’s death in police custody, this coalition of corporations and civil rights leaders has forced Facebook to reckon with the role it plays in perpetuating systemic racism and societal ills such as Holocaust denial. Demands include Facebook submitting to independent audits of hate speech and misinformation, removing all hate speech and hate groups, refunding corporations when ads appear next to hate speech, and hiring a civil rights executive to scrutinize products and policies for discrimination and hate.
“Now is not the time for simply statements of support that don’t come with actual real structural change,” Rashad Robinson, president of Color of Change, said. “Now is the time to actually change rules and change behaviors.”
If anything, Black users say, harassment has gotten worse as nationwide protests following Floyd’s death draw renewed attention to historic racial inequities.
Black people in private groups dedicated to discussing racial justice and police brutality report being swarmed by organized networks of white supremacists, who use racial slurs and tell them to go back to Africa.
A network of Facebook groups with more than 1 million members, created to protest coronavirus stay-at-home orders, is also targeting Black Lives Matter, using slurs to refer to Black people and protesters, the Associated Press reported Monday. The groups are rife with conspiracy theories, alleging that protesters were paid to attend demonstrations and that Floyd’s death was staged.
Facebook complaints go back years
Civil rights leaders say the boycott is lifting up a years-long effort to stop the censorship of Black users on social media.
From the start, the Black Lives Matter movement turned to Facebook as an organizing tool. Yet activists say they were soon set upon by bands of white supremacists who targeted them with racial slurs and violent threats. In 2015, Color of Change, which was formed after Hurricane Katrina to organize racial justice campaigns on the internet, began pressuring Facebook to stop the harassment of Black activists by hate groups.
The Center for Media Justice began probing why content from people of color was being removed from Facebook in August 2016, after Facebook, at the request of law enforcement, shut down the live stream of Korryn Gaines, a Baltimore woman broadcasting her standoff with police. Gaines was later shot and killed by a police officer in front of her 5-year-old son, who was also struck twice by gunfire. At the same time, Black Lives Matter activists and Standing Rock pipeline protesters in North Dakota were reporting that their content was being removed, too.
In 2016 and again in 2017, civil rights and other groups wrote letters urging Facebook to conduct an independent civil rights audit of its content moderation system and to create a task force to institute the recommendations.
Black users complain of little progress
Two years ago, Facebook agreed to a civil rights audit as it rushed to contain the damage from revelations that shadowy Russian operatives posing as Americans had targeted unsuspecting users with divisive political messages to sow discord during the 2016 presidential election. African Americans were one of the main targets of the operation, run by Russia’s Internet Research Agency. The same day Facebook announced the civil rights audit, it agreed to a second audit into charges of anti-conservative bias.
In the years since, civil rights leaders say there have been few signs of progress in how Facebook deals with racially motivated hate speech against the Black community or the erasure of Black users’ speech. The final version of the civil rights audit is expected to be released Wednesday.
One of the problems, these leaders say, is the dearth of underrepresented minorities at Facebook, particularly in positions of influence. Despite repeated pledges to close the racial gap in its U.S. workforce, less than 4% of Facebook’s employees are Black: 1,025 out of 27,705, according to the company’s most recent government filing.
Video: Facebook says it will flag all “newsworthy” posts from politicians, including President Donald Trump, that may incite violence, suppress voting or break its other content rules. (AP, June 26)
Last year, a dozen current and former Facebook employees went public anonymously to say they are treated as if they “do not belong,” portraying Facebook as a culture rife with racist and discriminatory behavior against African Americans and Hispanics. Their complaints came a year after a former Facebook manager, Mark Luckie, accused the social media company of having a “black people problem.”
Civil rights groups want hate speech rule changes
Facebook relies on a set of rules called “Community Standards” to guide decisions about what constitutes hate speech. These standards are enforced by computer algorithms and human moderators. Facebook acknowledges it makes mistakes when flagging and removing content.
Sandy Broadus, an attorney, anti-racism educator and activist, says that when she and another Black woman were called a profanity by a white man, Facebook refused to take down the comment. A friend reported being called a racial slur but was told that didn’t break the rules either, according to a screenshot reviewed by 360aproko. Yet when that friend posted a screenshot of the slur on Facebook, she was banned. Last year, a post saying “14-year-old Emmett Till was murdered by racist and knee-jerk terroristic white men” sent Broadus to Facebook jail; the white allies who reposted it faced no penalty.
The problem, as Broadus sees it, is the company’s policy of protecting all racial and ethnic groups equally, even those that do not face oppression or marginalization. That approach, she says, ignores centuries of systemic racism and confuses commentary on racism with attacks on a protected group of people.
Neil Potts, public policy director at Facebook, told 360aproko last year that applying more “nuanced” rules to the daily tidal wave of content rushing through Facebook and its other apps would be challenging.
Facebook says it’s taking steps to address mistakes in content takedowns. New pilot programs are training some content moderators as hate speech specialists and reevaluating the questions moderators are asked to consider when they make decisions.
“The question for me becomes if you are telling me that these things don’t violate community standards, then what you are saying is that they don’t violate white community standards because they certainly violate those of us who are legally protected classes under the law,” Broadus says. “Facebook is operating in their own little microcosm where they are treating white fragility and white feelings as if those are legally protected, and they are not.”
Facebook too useful to quit
Still, Black users say they have no choice but to stick with Facebook, a critical lifeline to their communities. Movements on social media have helped put the deaths of Black Americans at the hands of police officers on the public agenda, along with racial disparities in employment, health and other key areas, they say.
To elude bans when talking about racism, Black users say they resort to using a white avatar, digital slang such as “wypipo” or “huite,” or a cloud emoji as phonetic or visual stand-ins for white people. They operate under aliases and maintain back-up accounts to avoid losing content and access to their community. And they’ve developed a buddy system to alert friends and followers when someone has been sent to Facebook jail, sharing the news of the suspension and the posts that put them there.
“This movement,” Parker says, “does not exist without social media.”
Like others, Parker puts up with being cut off from her community for hours, days or weeks at a time. In all, she estimates she’s been locked out of her Facebook account for a total of two years, isolating her in critical moments when people need her, such as the start of the COVID-19 pandemic.
When Facebook responded to Floyd’s killing last month by launching “Lift Black Voices,” a campaign to highlight stories from Black people and share educational resources, Parker was incredulous.
“When you are the No. 1 oppressor of Black voices, you are going to lift Black voices?” she said. “That is absolutely not what you do.”