SALT LAKE CITY — Facebook's announcement that it was banning a half-dozen popular extremists and conspiracy theorists has ignited a heated debate on censorship.
An undercurrent in the discussion that took place on social media sites and news sites is why the social media giant made the move and whether it can effectively enforce what it says is a long-standing prohibition of "hateful treatment of people based on characteristics such as race, ethnicity or religion."
In late March, Facebook posted a statement in its newsroom titled "Standing Against Hate" that announced "a ban on praise, support and representation of white nationalism and white separatism on Facebook and Instagram."
The announcement said the ban would take effect the following week, but it wasn't until Thursday afternoon that online news outlets simultaneously reported that the social media platform had banned six extreme right-wing and anti-Semitic leaders, identifying them as "dangerous individuals."
They included Nation of Islam leader Louis Farrakhan, Infowars host Alex Jones, Milo Yiannopoulos, Laura Loomer, Paul Nehlen, and Paul Joseph Watson. The latest bans apply to both Facebook’s main service and to Instagram and extend to fan pages and other related accounts, according to The Associated Press.
The news reports said the individuals and Infowars fell under Facebook's Dangerous Individuals & Organizations policy.
The immediate reaction to the announcement was dominated by a debate over censorship, between those who said the bans were justified and others who called the move dangerous and one-sided, favoring liberal commentators and websites.
Even President Donald Trump weighed in over the weekend.
"How can it be possible that James Woods (and many others), a strong but responsible Conservative Voice, is banned from Twitter?," the president tweeted on Saturday. "Social Media & Fake News Media, together with their partner, the Democrat Party, have no idea the problems they are causing for themselves. VERY UNFAIR!"
Politico explored the responses by both sides of the censorship debate.
"Facebook just announced they are finally banning Alex Jones, Milo Yiannopoulos and other racist conspiracy theorists," tweeted Christopher Wylie, the whistleblower who triggered the Cambridge Analytica data scandal. "Let’s not forget that a mere 10 months ago, they were defending these same people’s right to publish their hateful disinformation and harass Sandy Hook families."
The publication quoted an email from Dan Gainor of the Media Research Center, a conservative tech and media watchdog, who "praised the removal of Farrakhan but bemoaned the company’s other actions. 'It's unfortunate that online speech doesn't have enough room for controversial views.'”
Facebook and other social media platforms like Twitter and YouTube have struggled to foster debate among competing ideas without providing a platform for internet trolls who intentionally spread misinformation to inflame feelings and, in worst-case scenarios, incite violence.
Indeed, past attempts to ban bad actors have been unsuccessful.
After YouTube had banned Jones in August 2018, he was introduced in early April on Logan Paul's podcast via YouTube as "the most banned person in the 21st century,” according to a piece in The Washington Post that explained how controversial figures can find their way around being blocked from social media platforms.
Jones and his Infowars pages had evaded the ban through unofficial pages and people "dedicated to resharing Infowars content, which popped up within days of the official accounts’ removal," according to Wired.
Facebook acknowledges people can successfully "game our systems to spread hate," but the platform says it's also getting smarter. Wired reported after Thursday's announcement that Facebook will deploy a tool reserved for terrorist organizations and self-proclaimed hate groups to close the earlier loophole.
"The company says that users will be prohibited from sharing Infowars videos, radio clips, articles, and other content from the site unless they are explicitly condemning the material. Fan pages or groups that reshare Infowars content in a sincere or appreciative manner won’t be permitted."
It will rely on human content moderators to allow users to share Infowars content for the purpose of criticizing it.
But Facebook's efforts won't stop at just removing content. Its "Standing Against Hate" statement in March said it will also attempt to steer people away from what it deems as hate groups.
"People searching for these terms will be directed to Life After Hate, an organization founded by former violent extremists that provides crisis intervention, education, support groups and outreach."
But Wired and several outlets pointed to Facebook's clumsy rollout of Thursday's ban as an indication the largest social media platform in the world may again fail to filter out violators of its content policies.
Facebook hadn't shut down the banned accounts before the announcement, allowing Jones and others to announce they had been banished and direct their followers to other sites.
A Facebook spokesperson told Wired that "scrubbing all of the accounts from the platforms took much longer than Facebook had anticipated."
Observers also speculated on why Facebook announced its ban.
Some said the social media giant needed some good press in the wake of news that it was expecting the Federal Trade Commission to levy a $5 billion fine for failing to protect its users' personal data.
Indeed, banning hate speech aligns with what many social media users want from Facebook, Jack Morse of Mashable predicted earlier this week. He chided Facebook for boasting at a recent developers conference how its artificial intelligence tools could prevent users from buying drugs over the internet.
"Of course, there are far worse things than weed on Facebook. Like hate speech and misinformation meant to sway elections and attack minority groups," he wrote. "Maybe, and it's just a suggestion, focus more on that and less on making sure we can't buy cheap weed via your digital town square?"
While Facebook did just that last week, the reasons could be deeper than just satisfying what it says is a large cohort of its users. Facebook may also be looking to protect the unique status that digital companies enjoy as arbiters of free expression.
"As the legal scholar Jeffrey Rosen put it, '(The) lawyers at Facebook and Google and Microsoft have more power over the future of … free expression than any king or president or Supreme Court justice,'" wrote Jonathan Peters in the Columbia Journalism Review in November 2018.
He explained that an obscure federal law — Section 230 of the Communications Decency Act, passed in 1996 — unintentionally protects social media platforms against libel claims that would be brought against other publishers or individuals.
"That means platforms such as Facebook, Twitter, and Gab, are not liable for most of their user-generated content. (There are exceptions for certain criminal and intellectual property claims.)"
But Section 230 was written long before social media platforms came on the scene, and the public and policymakers are demanding these platforms take more responsibility for the content they host.
Social media platforms have evolved into a media hybrid that doesn't just distribute information but curates and packages it in ways that target specific groups of people, wrote Tarleton Gillespie for Wired nearly a year ago.
Gillespie calls for reforming federal law so that social media platforms assume the obligation of serving the public in exchange for the license to do business, as telecoms and broadcasters have had to do.
"Until the law is rethought, social media platforms will continue to enjoy the right but not the responsibility to police their sites as they see fit."