source link: https://www.washingtonpost.com/politics/2024/02/21/meta-is-downplaying-political-content-heres-what-that-really-means/

Meta is downplaying “political content.” Here’s what that really means.

Analysis by Will Oremus
February 21, 2024 at 9:20 a.m. EST
The Technology 202

A newsletter briefing on the intersection of technology and politics.


Happy hump day. Is Wednesday still a hump day if you had Monday off? I say it is if you want it to be. Send news tips to: [email protected].

Below: The House is playing catch-up to the Senate on AI. First:

Meta is downplaying ‘political content.’ Here’s what that really means.

Meta announced earlier this month that it will stop recommending “political content” to most users on Instagram and Threads. The move, which will still allow users to see political content from people they follow, extends a policy first applied to Facebook in 2021. 

As my colleagues Taylor Lorenz and Naomi Nix reported, that came as a blow to activists, newshounds and journalists who had flocked to Threads as an alternative to Elon Musk’s X. Others welcomed the idea of a company that has famously struggled to moderate political discourse stepping back from its role in that realm.

Glossed over in Meta’s announcement, however, was a crucial question: How exactly does the company define political content? 

What counts as political is notoriously tough to pin down. In the 1960s, feminist Carol Hanisch asserted that “the personal is political” as she challenged traditional social hierarchies and family structures. George Orwell wrote in 1946, “In our age there is no such thing as ‘keeping out of politics.’ All issues are political issues … .” That feels even truer today, when culture wars and conspiracy theories swirl around Bud Light and Taylor Swift, and when the science around climate change and vaccines sparks partisan rancor. 

How the world’s largest social media company applies its policy could have ripple effects through its platforms, determining which kinds of influencers and publishers thrive and gain followers and which ones see their audience stagnate. It could also affect Threads’ viability as an alternative to X as a forum for news and debate.

So far, the company has offered only clues about where it will draw those lines. In a blog post announcing the policy, Instagram described political content as “potentially related to things like laws, elections, or social topics.” Laws and elections seem clear-cut enough, as categories go, but “social topics” leaves a lot of room for guesswork.

In a statement to The Tech 202, Meta spokeswoman Claire Lerner offered a bit more detail. 

“Social topics can include content that identifies a problem that impacts people and is caused by the action or inaction of others, which can include issues like international relations or crime,” she said. She added that Meta will continue to refine its definition over time.

Any account that posts more than a given threshold of political content, which Meta has not specified, will be ineligible for recommendations. For other accounts, only their political posts will be excluded from recommendations.

Lerner emphasized that Meta isn’t trying to eliminate politics from users’ feeds. “This change does not impact posts from accounts people choose to follow; it impacts what the system recommends, and people can control if they want more. We have been working for years to show people less political content based on what they told us they want, and what posts they told us are political.”

Still, the history of social media teaches us that the people who use online platforms respond to the incentives those platforms create. And attempts to reshape those incentives often have unintended consequences. 

Ryan Broderick, who writes the newsletter Garbage Day, has been watching the effects of Facebook’s attempt to shift away from politics in recent years. He said that overtly political pages and publishers have seen engagement drop, but a page espousing Christian fundamentalism has been topping the charts. Meanwhile, culture-war debates have largely shifted to “topics that Meta does not see as political yet,” such as transgender issues, “men’s rights” or a recent TV episode that touched on a hot-button issue.

“These political fights will still happen,” he said. “They just won’t involve discussions of Washington, D.C., or include legitimate publishers.”

Meta’s pullback from overtly political content could make sense from a business perspective, at least in the short term, said Katie Harbath, a former Facebook public policy director who is now chief global affairs officer at Duco Experts, a tech consulting firm. 

But she agreed that separating the political from the innocuous is easier said than done — especially in a big election year, when politics is “the thing everyone’s talking about.”

While including “social topics” in its definition of political content opens a can of worms, Harbath added, there are good reasons for Meta to do so. During the 2016 election, she noted, ads by Russian trolls seeking to sow division among Americans weren’t immediately recognized as political ads because they were “issue-based” rather than “candidate-based.” 

Harbath said she also worries about the long-term impact of Meta’s policy on civic participation, even if it turns out to be right that users prefer it. “If people are just tuning out of politics and news because they’re burned out, they just want entertainment, does that just feed into the extremes?”

Our top tabs

House leaders unveil bipartisan AI task force
Rep. Jay Obernolte (R-Calif.) at a House hearing on AI in October 2023 (Ricky Carioti/The Washington Post)

Rep. Jay Obernolte (R-Calif.) and Rep. Ted Lieu (D-Calif.) will lead the new group, which is responsible for drafting a report on bipartisan AI policy proposals, my colleague Cat Zakrzewski reports. 

The House initiative comes nearly a year after the Senate first unveiled its own bipartisan AI gang. Senate Majority Leader Charles E. Schumer (D-N.Y.) has promised to unveil a framework for AI policy in the near future, but its chances of passing Congress in an election year are increasingly slim. 

The House group includes 12 members from each party, including Rep. Alexandria Ocasio-Cortez (D-N.Y.) and Rep. Darrell Issa (R-Calif.). The initiative will focus on advancing American leadership in AI and work with House committees already developing AI legislation. 

“As we look to the future, Congress must continue to encourage innovation and maintain our country’s competitive edge, protect our national security, and carefully consider what guardrails may be needed to ensure the development of safe and trustworthy technology,” House Speaker Mike Johnson (R-La.) said in a statement.

Agency scanner

Inside the industry

Competition watch

Privacy monitor

Trending

Before you log off

That’s all for today — thanks for joining us! Make sure to tell others to subscribe to The Technology 202 here. Get in touch with Cristiano (via email or social media) and Will (via email or social media) for tips, feedback or greetings.

