Turning on the ‘for you’ feed on X shifted political opinions, but turning it off did not

Summary

A field experiment analysed the political effects of X’s algorithmic “For you” feed in 2023. Researchers found that switching the algorithmic feed on nudged users’ political opinions to the right. When the algorithm was later switched off, users’ stated political attitudes did not revert, largely because the algorithm had already changed who they followed: users began following more right-leaning accounts while the feed was on, and these follow choices persisted after it was turned off.

The study suggests the feed’s impact operates not only through immediate content exposure but also by reshaping users’ social networks, producing longer-lasting influence.

Author style

Punchy: this isn’t just an academic footnote. The experiment shows a platform tweak can steer politics by altering who people follow — and that the effects can outlive the algorithm itself. If you care about online influence or platform policy, read the original study.

Key Points

  1. Turning on X’s algorithmic “For you” feed in a field experiment shifted users’ political opinions to the right.
  2. Turning the algorithm off did not reverse those shifts in political attitudes.
  3. The algorithm increased users’ likelihood of following right-leaning accounts.
  4. Follow choices persisted after the algorithm was disabled, helping to explain the lasting effect on opinions.
  5. Results imply platform design can produce durable political influence by reconfiguring users’ social networks, not merely through momentary content exposure.

Context and relevance

This study builds on broader debates about social-media algorithms and democracy, joining experimental work that tests the causal effects of recommendation systems. It matters for regulators, platform designers and researchers because it shows that algorithm-driven exposure can create persistent changes in users’ information environments — a mechanism for long-term influence that simple on/off interventions may not undo.

Why should I read this?

Quick and to the point: if you want to understand how a seemingly small UI change can nudge politics and leave a lasting footprint, this is your short read. It explains why switching off an algorithm doesn’t necessarily undo its effects — because people keep following the accounts they discovered while it was on.

Source

Source: https://www.nature.com/articles/d41586-026-00486-z