source link: https://www.washingtonpost.com/technology/2021/06/04/trump-facebook-ban-5-things/

Five things to know about Facebook’s Trump decision

The social media giant banned the former president for two years and clarified some content moderation rules

President Donald Trump taps a phone at the White House in June 2020. He was a frequent user of social media. (Leah Millis/Reuters)
June 4, 2021 at 7:33 p.m. UTC

After months of back and forth with its own Oversight Board, Facebook announced Friday that former president Donald Trump would remain banned from the site for at least two years.

The timeline was part of a longer reply to recommendations from the board about how Facebook could better handle political figures and newsworthy posts. Of the 19 recommendations the Oversight Board put forth a month ago, Facebook said it will fully or partially implement 16.

The group of 20 experts from around the world was created and funded by Facebook to help handle its most difficult content moderation issues and to show concerned lawmakers that it could self-regulate. (The company says the board is not a substitute for regulation.) The board’s decision on Trump last month was its most consequential to date.


Here’s what you need to know about Facebook’s response.

Trump is still banned from Facebook, but only for two years

The company banned Trump from Facebook and Instagram “indefinitely” in January, leaving up his pages on both sites and allowing people to comment, but blocking him from accessing or posting to them. In its decision last month, the Oversight Board agreed the immediate ban was justified but took issue with what it described as an arbitrary, indefinite penalty and asked for clarity on how long it would last. In response, the company created new rules for how it handles public figures and applied them retroactively to Trump’s account.

Facebook’s solution is to make the ban last two years, meaning that in 2023, the year before the next presidential election, Trump could be allowed back on the site. There are some caveats. At the end of the two-year period, Facebook will consult with “experts” about letting him back online. It will principally weigh the risks to public safety from his use of the service.


If Trump’s suspension is lifted, he will essentially be on probation and subject to strict punishments if he violates the social network’s rules again. In that case, he could have his pages removed permanently.

Facebook says it will be more transparent about the consequences for breaking its rules

The most concrete change Facebook made Friday was outlining some specific consequences for users who break its content policies. And those apply to everyone — not just public figures.

The Oversight Board criticized the company for making arbitrary decisions rather than establishing a set of transparent rules for all users, and Facebook said it was responding to that.

It will publish more details about its strike policy, which was previously opaque about what it took for a user to get kicked off. Now, breaking its policies will lead to progressively longer restrictions: one strike merits a warning, two strikes block the user from posting to the site for one day, and the strictest penalty is a 30-day ban for five or more strikes.


This system is significantly less strict than YouTube’s strike system, which has been public and operational for years. YouTube bans accounts permanently for getting three strikes within 90 days. The Washington Post has previously reported, however, that influential users got preferential treatment under YouTube’s penalty system.

But even under the new system, breaking Facebook’s rules won’t automatically result in a strike.

“Whether we apply a strike depends on the severity of the content, the context in which it was shared and when it was posted,” the company said in a blog post about the new rules.
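To make the escalation concrete, here is a minimal sketch of that strike ladder as described above, written in Python. It is purely illustrative: the function name, the zero-strike case and the handling of three or four strikes are assumptions, since Facebook has not published its full enforcement logic and says severity, context and timing also determine whether a violation counts as a strike at all.

```python
# Illustrative sketch only: a hypothetical model of the strike tiers named in the article.
# Facebook's actual enforcement logic is not public; the zero-strike case and the
# penalties for three or four strikes are assumptions.

def penalty_for_strikes(strikes: int) -> str:
    """Map a strike count to the penalty tier described in the article."""
    if strikes <= 0:
        return "no action"
    if strikes == 1:
        return "warning"
    if strikes == 2:
        return "one-day posting block"
    if strikes >= 5:
        return "30-day posting block"  # the strictest tier named in the article
    # The article says restrictions get progressively longer but does not
    # spell out the penalties for three or four strikes.
    return "longer posting block (duration unspecified for 3-4 strikes)"

if __name__ == "__main__":
    for count in range(7):
        print(count, "->", penalty_for_strikes(count))
```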

Politicians and public figures will face more scrutiny from content moderators

Much of the conversation about content moderation and social media has centered on politicians using the platforms to intimidate opponents and rally their followers, sometimes into violent action. Trump’s encouragement of the Jan. 6 rioters was what got him kicked off the platform.


But other public figures around the world, such as Philippine President Rodrigo Duterte, former Ecuadoran president Rafael Correa and Brazilian President Jair Bolsonaro, have also used Facebook in ways that human rights organizations say have led to violence and broader public harm.

Facebook has defended leaving up some posts from politicians under its “newsworthiness exemption” — saying the public has a right to know what their leaders are posting. But now, the company says it will institute harsher rules for public figures who incite violence during times of civil unrest.

“Public figures often have broader influence across our platforms; therefore, they may pose a greater risk of harm when they violate our policies,” the company said. Under Facebook’s definition, public figures include anyone holding state or national office, political candidates, accounts with more than 1 million followers and those who “receive substantial news coverage.”


The shift in approach to politicians is essentially following in the footsteps of Twitter, which made changes to its own public-interest exemption in 2019 in order to show when public figures broke its rules.

The newsworthiness exemption still exists, and it’s still vague

Facebook and other social media platforms have generally allowed posts to break their rules against violence or nudity if they are newsworthy, such as a documentary or news report about a war or a police shooting. Facebook has also applied this to posts from some politicians — though it hasn’t said which ones and when.

The company told the Oversight Board that it hadn’t applied the exemption to Trump’s posts, but it walked that back Friday by saying it “discovered” that it had done so in 2019 with a video Trump posted of one of his rallies. The Post previously reported that the exemption was developed in response to Trump. Facebook has long avoided publicly discussing how many times it has applied the exemption globally or the details of those cases.


Facebook said it will take a variety of factors into account when making exceptions for newsworthiness, such as whether it helps other people avoid danger or what the political situation is in the country at the time of the post. Newsworthy content can still be taken down if it could lead to violence or other kinds of harm, Facebook said.

What does this mean for the other big social media platforms?

Facebook’s decision was being closely watched because of the site’s status as the world’s largest social platform, and one that has played a key role in the rise of Trump and other leaders around the world.

Spokespeople for Google’s YouTube have said the company makes its own decisions about content moderation, and won’t be pushed in either direction based on Facebook’s decision or recommendations by its Oversight Board.


Still, because of Facebook’s size, it’s feasible that its position could set a norm for other companies. Twitter, which Trump used daily during his presidency, permanently banned him rather than issuing the kind of indefinite suspension that Facebook and YouTube opted for.

For YouTube, there could now be renewed pressure to clarify its own policy and put a timeline on Trump’s suspension, as Facebook has done.

