Andrew Harnik, Associated Press
In this April 10, 2018, file photo, Facebook CEO Mark Zuckerberg testifies before a joint hearing of the Commerce and Judiciary Committees on Capitol Hill in Washington about the use of Facebook data to target American voters in the 2016 election. Facebook announced on Wednesday, March 27, 2019, that it is broadening its definition of hate speech to apply to "white nationalists" and "white separatists." The company previously allowed posts from those groups even though it has long banned "white supremacists."

SALT LAKE CITY — Facebook announced Wednesday it will ban “praise, support and representation” of white nationalism and white separatism on Facebook and Instagram, adding that it will direct users who post or search content connected to these ideologies to an organization that helps people leave hate groups.

The policy comes weeks after Facebook received intense scrutiny for how its platform was leveraged by the killer in the Christchurch, New Zealand, mosque shooting to share a 74-page white supremacist manifesto and livestream a video of the killings. According to The Guardian, Facebook did not receive the first user report about the livestream until 12 minutes after it had ended. After the video was removed, there were over 1.5 million attempts to re-upload it — 20 percent of which were successful — raising questions about the social network’s ability to efficiently detect and remove hateful or dangerous content.

The move also comes at a time when white supremacist propaganda is becoming increasingly pervasive. As the Deseret News previously reported, instances of white supremacist propaganda on college campuses in the U.S. increased 182 percent from 2017 to 2018, and almost 500 percent in Utah during the same time period.

The ban marks a departure from Facebook’s previous guidelines, which drew a distinction between white supremacy on one hand and white nationalism and white separatism on the other. While the social network banned “praise, support, or representation of white supremacy,” it allowed users to express support for white nationalism or separatism because those ideologies didn't “seem to be always associated with racism (at least not explicitly),” according to a 2018 Motherboard investigation.

Marcio Jose Sanchez, Associated Press
In this April 4, 2013, file photo, Facebook CEO Mark Zuckerberg walks at the company's headquarters in Menlo Park, Calif.

In a statement announcing the ban, Facebook explained, “We didn’t originally apply the same rationale to expressions of white nationalism and white separatism because we were thinking about broader concepts of nationalism and separatism — things like American pride and Basque separatism, which are an important part of people’s identity.”

“But over the past three months,” the statement continues, “our conversations with members of civil society and academics who are experts in race relations around the world have confirmed that white nationalism and white separatism cannot be meaningfully separated from white supremacy and organized hate groups. ... Going forward, while people will still be able to demonstrate pride in their ethnic heritage, we will not tolerate praise or support for white nationalism and white separatism.”

Many civil rights groups and nonprofits have applauded Facebook’s decision to ban speech promoting white nationalism and white separatism. But the policy has also raised questions about how it will be enforced, whether it reflects anti-conservative bias online, and whether other popular platforms such as YouTube and Twitter will follow Facebook's move and enact similar policies.

Let’s explore answers to each of these questions.

What is the difference between white supremacy, white nationalism, and white separatism?

Originally, Facebook’s policy drew a distinction between white supremacy and white nationalist or white separatist ideologies. But experts have argued that the three are inseparable.

  • The Anti-Defamation League defines white supremacy as a belief system with several tenets: “1) whites should have dominance over people of other backgrounds, especially where they may co-exist; 2) whites should live by themselves in a whites-only society; 3) white people have their own ‘culture’ that is superior to other cultures; 4) white people are genetically superior to other people.”
  • It defines white nationalism as a “euphemism for white supremacy,” adding that some white supremacists have used the term to “refer to a form of white supremacy that emphasizes defining a country or region by white racial identity and which seeks to promote the interests of whites exclusively, typically at the expense of people of other backgrounds.”
  • It defines white separatism as “a form of white supremacy that emphasizes the idea that white people should exist separately from all inferior, non-white races, whether by establishing an all-white community somewhere or removing non-whites from their midst.”

Facebook representatives told Motherboard that an “overwhelming majority of experts” said white nationalism and white separatism have close connections to organized hate, and that “all experts it spoke to” agreed that white nationalistic sentiments expressed online have led to real-world violence and harm, which led the company to change its policy.

Is the ban an example of “anti-conservative bias” online?

Facebook and other social networks have previously been accused of “anti-conservative” bias in their algorithms and search results, though these claims have been hotly contested. Critics of Facebook’s new policy will likely point out that other separatist and nationalist movements aren’t being banned.

As Motherboard reported, Facebook will still allow content relating to black separatist movements and the Basque separatist movement to be posted, due to experts' arguments that white separatism should be handled differently “because of the long history of white supremacism that has been used to subjugate and dehumanize people of color in the United States and around the world.”

" Whiteness is not an ethnicity, it is a skin color. And America is a multicultural society. White nationalism is simply a form of white supremacy. It is in an ideology centered on hate. "
Mark Pitcavage, who tracks domestic extremism for the Anti-Defamation League

However, Facebook does ban content centered on black nationalism, Motherboard reported. The Southern Poverty Law Center has characterized groups that espouse black nationalist ideology as hate groups, with the added proviso that "they should not be seen as equivalent to white supremacist groups — such as the Ku Klux Klan or neo-Nazis — in terms of their history of violence and terrorism."

Others have argued that limiting white nationalist and white separatist speech is equivalent to saying that white people are not allowed to have pride in their ethnic heritage.

Mark Pitcavage, who tracks domestic extremism for the Anti-Defamation League, told The New York Times that white nationalism is not the same thing as ethnic pride or American patriotism.

"Whiteness is not an ethnicity, it is a skin color," Pitcavage said. "And America is a multicultural society. White nationalism is simply a form of white supremacy. It is in an ideology centered on hate."

How will the ban on white nationalist and white separatist speech be enforced?

Brian Fishman, Facebook's policy director of counterterrorism, told Motherboard that the company will use content matching, an algorithmic technique for detecting and removing copies of images that have already been flagged, as well as machine learning and artificial intelligence to enforce the ban, but he didn't specify exactly how those tools would be used or what would be flagged.

These tools only go so far, however. As The Washington Post reported, only about 50 percent of posts containing hate speech were removed as soon as they were uploaded to Facebook last year. The rest of the work falls to human moderators, who are trained to decide whether a given piece of content should be removed from Facebook's platform.
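Facebook has not published the details of its matching pipeline, but the basic idea behind content matching is easy to illustrate. The sketch below is a rough assumption rather than a description of Facebook's actual system: it implements the simplest form of matching, an exact SHA-256 blocklist, with function names invented for this example. It also shows the technique's key weakness, and why production systems favor perceptual hashes that tolerate cropping and re-encoding: changing a single byte of a flagged file defeats an exact match, which helps explain how so many altered copies of the Christchurch video slipped through.

```python
import hashlib

# Hypothetical blocklist: digests of files moderators have already removed.
banned_hashes: set[str] = set()

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def flag_as_banned(data: bytes) -> None:
    """Record the digest of a file a moderator confirmed violates policy."""
    banned_hashes.add(digest(data))

def should_block(upload: bytes) -> bool:
    """Block an upload whose bytes exactly match a previously flagged file."""
    return digest(upload) in banned_hashes

# Once one copy is flagged, byte-identical re-uploads are caught automatically.
flag_as_banned(b"raw bytes of a flagged video")
print(should_block(b"raw bytes of a flagged video"))   # True: exact copy is blocked
print(should_block(b"raw bytes of a flagged video!"))  # False: any edit evades an exact match
```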

Questions have also been raised about how consistently Facebook's policies are enforced. For example, the social network has allowed certain neo-Nazi groups to continue maintaining their Facebook pages after finding that they did not violate its "community standards," the Independent reported.

Jenny Kane, Associated Press
In this March 13, 2019, file photo, the Facebook, Messenger and Instagram apps are displayed on an iPhone in New York.

Facebook representatives told Motherboard that the company will not immediately ban "implicit and coded white nationalism and white separatism" because messages that don't use language or images that explicitly reference white supremacy are harder to detect and police.

According to Motherboard, phrases like "I am a proud white nationalist" and "Immigration is tearing this country apart; white separatism is the only answer" would be banned under the new policy. Previously, these types of phrases would not have been banned. Facebook did not provide examples of the types of statements or images that would be allowed under the new policy.

It also remains unclear how Facebook will define white nationalist and white separatist content, something that needs to be explained, Madihha Ahussain, a lawyer for civil rights group Muslim Advocates, told The New York Times. "For example, will it include expressions of anti-Muslim, anti-Black, anti-Jewish, anti-immigrant and anti-LGBTQ sentiment — all underlying foundations of white nationalism?" Ahussain asked.

"Further, if the policy lacks robust, informed and assertive enforcement, it will continue to leave vulnerable communities at the mercy of hate groups," Ahussain added.

Have other platforms enacted similar bans?

Although other popular sites like YouTube and Twitter have community standards that restrict the types of content allowed on their platforms, Facebook "appears to be the first major platform to take a stance against white nationalism and separatism specifically," Louise Matsakis reported in Wired.

Twitter bans accounts affiliated with groups known to promote violence, and YouTube also takes down videos that incite violence. These platforms, along with Facebook, have traditionally leveraged their technology to prevent terrorist groups like the Islamic State or Al Qaeda from sharing videos or posts, but haven't invested as much energy in prohibiting white supremacist groups from posting content, Wired reported.

Steve Helber, Associated Press
In this Aug. 12, 2017, file photo, white nationalist demonstrators clash with counter-demonstrators at the entrance to Lee Park in Charlottesville, Va.

Concerns have recently been raised about YouTube's recommendations feature, which has been shown to steer users toward extreme content, CNN reported.

The recommendations strategy "makes sense from a marketing perspective," Seamus Hughes, the deputy director of the Program on Extremism at George Washington University, told CNN. "If you like Pepsi then you're going to watch more Pepsi videos ... but you take that to the logical extreme with white supremacy videos."

These companies are "going to have to figure out how to not completely scrap a system that has brought them hundreds of millions of dollars of ad revenue while not also furthering someone's radicalization or recruitment," Hughes added.
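Hughes' Pepsi analogy describes content-based recommendation: rank candidate videos by how similar they are to what the user already watched. The toy sketch below, with tag-overlap scoring invented purely for illustration (real systems use learned embeddings and engagement signals), shows why such a ranker keeps surfacing more of the same, which is harmless for soda ads and dangerous for extremist content.

```python
def score(candidate_tags: set[str], history_tags: set[str]) -> float:
    """Jaccard similarity between a candidate's tags and the watch history."""
    if not candidate_tags or not history_tags:
        return 0.0
    return len(candidate_tags & history_tags) / len(candidate_tags | history_tags)

def recommend(candidates: dict[str, set[str]], history_tags: set[str], k: int = 2) -> list[str]:
    """Return the k candidates most similar to what the user already watched."""
    ranked = sorted(candidates, key=lambda v: score(candidates[v], history_tags), reverse=True)
    return ranked[:k]

# A history dominated by one topic pushes more of that topic to the top.
history = {"soda", "pepsi", "taste-test"}
candidates = {
    "pepsi_challenge":  {"pepsi", "soda", "taste-test"},
    "cola_documentary": {"soda", "history"},
    "gardening_basics": {"plants", "how-to"},
}
print(recommend(candidates, history))  # ['pepsi_challenge', 'cola_documentary']
```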

Is banning certain kinds of speech the solution?

Other experts add that banning or taking down content shouldn't be the only tool for combating hate speech.

"What happens when large platforms take down this content or these views is that the content just shifts to smaller platforms" which are often more radical, Bill Braniff, the director of the National Consortium for the Study of Terrorism and Responses to Terrorism, told CNN.

"If instead of (re-directing) people who might be flirting with extremism to (supportive resources), if you censor them and remove them from these platforms you lose ... the ability to provide them with an off-ramp," Braniff said.

This may be part of the rationale for Facebook's decision to direct users who post or search for white supremacist, nationalist, or separatist speech to the organization Life After Hate, which is run by former far-right extremists.

"Unfortunately, there will always be people who try to game our systems to spread hate," Facebook said in its statement. "Our challenge is to stay ahead by continuing to improve our technologies, evolve our policies and work with experts who can bolster our own efforts."