Under Meta’s revamped “More Speech and Fewer Mistakes” moderation policy, Instagram and Threads will begin recommending political content from accounts users don’t follow. The shift, confirmed by Instagram head Adam Mosseri, is a dramatic reversal from the platforms’ previous approach, which aimed to minimize the visibility of political content and create a less divisive environment.
For nearly a year, Instagram and Threads defaulted to blocking recommendations of political content from accounts a user doesn’t follow. This approach was in line with Meta’s broader goal of fostering “less angry” spaces and discouraging exposure to polarizing topics like politics and hard news. However, Mosseri recently announced a policy change, saying, “We’re going to be adding political content to recommendations” on both platforms. The pivot coincides with Meta’s new moderation strategy and its updated rules on permissible speech, marking a departure from previous efforts to prioritize user well-being over engagement with sensitive content.
The timing of the shift is notable: it coincides with the Trump administration’s return to the political forefront, and may reflect an effort by Meta to recalibrate its platforms ahead of increased political discourse. In a series of posts on Threads, Mosseri acknowledged the tension between the new direction and his earlier statements, reiterating his long-standing view that platforms like Instagram and Threads should not actively push political content from accounts users don’t follow, a sign of some internal conflict over the decision.
In a video posted to Instagram, Mosseri elaborated on the reasoning behind the change. He pointed out that the demand for political content on Threads contrasts sharply with feedback from previous years, when users felt overwhelmed by such material on Meta’s platforms. This earlier feedback drove the company’s decision to limit the visibility of political posts, focusing instead on less contentious content to create a more pleasant user experience.
The Wall Street Journal highlights several factors that may have influenced the reversal. One is internal dynamics at Meta, particularly CEO Mark Zuckerberg’s frustration with his own platforms’ content filters. According to reports, Zuckerberg saw firsthand how the filters limited the reach of his posts, including a personal update about recovering from a torn ACL, an experience that reportedly fueled his push for less restrictive content moderation.
Another key factor is the appointment of Meta’s new policy chief, who is seen as friendlier to Trump-aligned politics. The leadership change could signal a broader strategy of accommodating political figures and their supporters, potentially strengthening Meta’s standing in a politically charged climate.
While the updated approach may increase engagement by exposing users to more diverse perspectives, it also risks reigniting criticisms that Meta is prioritizing business interests over user well-being. Critics argue that the reintroduction of political recommendations could exacerbate polarization and misinformation, issues that Meta had previously sought to address by de-emphasizing such content.
This policy change underscores the ongoing challenges Meta faces in balancing its business model, which thrives on user engagement, with its responsibility to foster healthy online interactions. By reintroducing political content into recommendations, Meta is taking a calculated risk that could shape the trajectory of its platforms and their role in the evolving digital landscape.