source link: https://arstechnica.com/tech-policy/2023/05/twitter-fails-to-remove-label-graphic-images-after-texas-mass-shooting/

Causing distress —

Musk’s only response to graphic shooting images is to doubt gunman’s Nazi ties

Mass shooting images went viral on Twitter despite seemingly violating policies.

Ashley Belanger - 5/8/2023, 5:12 PM

A sign asking people to "Pray for Allen, Texas," stands at a memorial to those killed at the Allen Premium Outlets mall after the mass shooting on May 8, 2023, in Allen, Texas.

Graphic images from a Texas mass shooting on Saturday that killed nine people (including the gunman) and wounded seven are still circulating on Twitter after spreading virally all weekend. Critics told The New York Times that, unlike other platforms, Twitter isn't doing enough to remove or label these "unusually graphic" images, especially footage in which the bodies of some victims, including a young child, appear to be identifiable, Reuters reported.

Family members do "not deserve to see the dead relatives spread across Twitter for everybody to see,” photojournalist Pat Holloway told the Times. Over the weekend, Holloway joined others in tweeting directly at Twitter CEO Elon Musk to improve the platform's content moderation.

Twitter's policy on sharing content after a violent attack acknowledges that "exposure to these materials may also cause harm to those that view them." That policy is primarily focused on banning the distribution of content created by perpetrators of attacks, but it also places restrictions on "bystander-generated content" depicting "dead bodies" or "content that identifies victims."

Another policy on sharing sensitive media says that "there are also some types of sensitive media content that we don’t allow at all"—including some depictions of deaths, violent crimes, and bodily fluids like blood—"because they have the potential to normalize violence and cause distress to those who view them."

So far, Musk, Twitter trust and safety chief Ella Irwin, and the @TwitterSafety account have not tweeted or commented to clarify how Twitter's policies apply in this case.

Musk did respond to an account tweeting about the gunman and pushing back against a Washington Post report that described the gunman, Mauricio Garcia, as potentially holding neo-Nazi beliefs. A law enforcement official told The Daily Mail that federal agents had reviewed Garcia's social media accounts and found he "had expressed an interest in neo-Nazi views" and could be seen wearing "a patch on his chest reading RWDS"—an acronym for "Right Wing Death Squad" that is used by extremists and white supremacists.

"Do they cite any evidence for him being a 'nazi white supremacist'?" Musk tweeted. He seemed to be asking for clarification after boasting that—unlike news reports describing the shooting, in his view—"this platform is hell bent on being the least untrue source of information."

It's possible that images from the shooting spread more quickly on Twitter because the platform invests notably less in content moderation than other platforms. Last fall, Twitter came under fire for gutting its content moderation team and then ditching its Trust and Safety Council. Earlier this year, the European Union told Musk to hire more moderators or risk falling out of compliance with the EU's Digital Services Act. At the time, Twitter said it intended to comply with the EU order, but so far, Musk seems happier to rely on Community Notes and user reports flagging violative content than to rebuild a trust and safety team that prioritizes content moderation.

On Twitter, there's an ongoing debate between users who want to share the images from the shooting to protest gun violence and those, like Holloway, who expect Twitter to block such sensitive content. For those who want to share the images, Twitter recommends that users proactively mark them as sensitive media. To do that, "navigate to your safety settings and select the 'Mark media you Tweet as containing material that may be sensitive' option," Twitter's policy directs users. Twitter will also apply the sensitive media filter to violative images reported by users.

Twitter did not respond to Ars' request for comment.
