source link: https://www.socialmediatoday.com/news/facebook-adds-new-penalties-for-group-members-who-repeatedly-violate-platfo/608552/

Facebook Adds New Penalties for Group Members Who Repeatedly Violate Platform Rules

Published Oct. 20, 2021
By Andrew Hutchinson, Content and Social Media Manager

Facebook has announced some new penalties for group members who repeatedly violate its rules, as well as new control options for group admins to police potentially harmful content in their groups.

First off, on the new restrictions: Facebook will implement new reach penalties for group members whose posts have previously violated Facebook’s rules anywhere on the platform, restricting their capacity to spread misinformation or hate speech across the board.

As explained by Facebook:

“To continue limiting the reach of people who break our rules, we will start demoting all Groups content from members who have broken our Community Standards anywhere on Facebook. These demotions will get more severe as they accrue more violations.”

That’s significant because private groups, in particular, remain a problematic element, given that they’re largely free of public scrutiny. People can share potentially harmful content among community peers who are receptive to it, and such posts are less likely to be reported and penalized as a result. This change will further restrict their capacity to spread that content beyond their own groups by applying blanket penalties for violations incurred anywhere on the platform.

So if you share a lot of anti-vax content in your freedom of speech group, you’d best keep it there, because any penalty you incur for sharing it on your personal profile will now also limit your capacity to spread it everywhere else.

Facebook’s also adding a new moderation element called 'Flagged by Facebook', which will enable group admins to view content that’s been flagged for removal before it’s shown to the broader community.

Flagged by Facebook

As per Facebook:

“Admins can then either review and remove the content themselves, or ask for a review by Facebook, and provide additional feedback on why they think that piece of content should remain on the platform. Flagged by Facebook involves admins in content review earlier in the process, before members receive a strike and content is removed.”

That adds an extra human element, and notably, humans who are more closely tied to the information being shared, which could help to avoid mistaken removals. It’ll also act as an education tool of sorts, helping admins understand the types of posts that Facebook will not allow, which could further improve group interaction and reduce violative content.

The new options come as Facebook faces more questions over its moderation decisions in the wake of The Wall Street Journal’s ‘Facebook Files’ exposé. With the platform’s motivations being queried, and new evidence showing that its systems can cause significant harm, it’s important for Facebook to provide more tools to address such concerns, from both a PR and a broader platform health perspective.

Facebook does invest heavily in such work, and it is continually looking to improve; this is not a knee-jerk response to these new concerns. But as the pressure mounts, it will be called upon to provide even more tools like this, as well as insight into the actual impact of such efforts on platform use.

