Hate communities often flourish online for years, raising the question of how they persist. My research team has found that powerful stories keep members of a hate group galvanized, either by repeating the story over and over or by constantly adding fresh accusations and interpretations to it.
I’m a computational social scientist who studies social and political networks. My colleagues and I uncovered these trends by examining 10 years of posts, reactions and participation patterns in Facebook groups that shared antisemitic and Islamophobic content. Our findings have been accepted at the 2026 International Conference on Web and Social Media.
First, we measured who was posting and how that posting activity related to engagement within each group. Groups in which a small number of people produced most of the content tended to attract more reactions and responses. Then we looked at the subjects group members discussed – religion, immigration, geopolitics – and the kinds of stories members told about those topics, such as describing an entire group of people as criminals or warning that certain kinds of people are secretly taking over a country’s way of life.
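The first measurement described above – how concentrated a group's content is among a few posters – can be sketched in a few lines. This is an illustrative toy, not the study's actual pipeline: the `(author_id, reaction_count)` data shape and the top-10% cutoff are assumptions made for the example.

```python
from collections import Counter

def top_poster_share(posts, top_fraction=0.1):
    """Fraction of a group's posts produced by its most active posters.
    `posts` is a list of (author_id, reaction_count) pairs -- a hypothetical
    data shape, chosen for illustration."""
    counts = Counter(author for author, _ in posts)
    n_top = max(1, int(len(counts) * top_fraction))
    top_total = sum(c for _, c in counts.most_common(n_top))
    return top_total / len(posts)

def mean_engagement(posts):
    """Average reactions per post in a group."""
    return sum(r for _, r in posts) / len(posts)

# Toy comparison: one group dominated by a single poster, one spread evenly.
concentrated = [("a", 5)] * 8 + [("b", 1), ("c", 1)]
diffuse = [(u, 2) for u in "abcdefghij"]

print(top_poster_share(concentrated))  # 0.8 -- most posts from one author
print(top_poster_share(diffuse))       # 0.1 -- posts spread evenly
```

Comparing a concentration score like this against engagement across many groups is one simple way to test whether few-voice groups draw more reactions, as the research found.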
When we put these pieces together, clear patterns emerged. Messages posted by a few very active people were strongly associated with higher engagement – likes and shares – in the near term, and repetition – espousing the same ideas again and again – boosted that short-term engagement. Over the longer term, however, a group tended to persist when many users kept adding fresh accusations, conspiracy theories and explanations, while very uniform content that recycled the same framing saw engagement fade.
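The repetition pattern described above can also be quantified crudely. The sketch below counts verbatim repeats of earlier messages – a rough stand-in for the study's richer narrative coding, which this example does not reproduce; the sample messages are invented.

```python
def repetition_rate(messages):
    """Share of posts that repeat an earlier message verbatim --
    a crude proxy for 'the same ideas again and again'."""
    seen = set()
    repeats = 0
    for m in messages:
        key = m.strip().lower()
        if key in seen:
            repeats += 1
        seen.add(key)
    return repeats / len(messages)

# Invented sample: three repeats of one claim plus one novel post.
msgs = [
    "They are taking over",
    "They are taking over",
    "A new accusation about X",
    "They are taking over",
]
print(repetition_rate(msgs))  # 0.5 -- half the posts are repeats
```

A high repetition rate paired with rising short-term engagement, versus a low rate in groups that persist long-term, would mirror the pattern the research describes.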
Different communities seemed to be drawn to different messaging patterns. In Islamophobic groups, the most prolific posters tended to repeat a narrow, consistent set of messages – often religiously framed posts that portrayed Muslims as morally condemned. In antisemitic groups, the most engaged members were more likely to share a mix of narratives, from tales of victimization to conspiracy theories about public figures.
[Photo: A woman protests after a Kashmiri shawl seller was assaulted in India on Jan. 31, 2026. NurPhoto via Getty Images]
Why it matters
Our findings suggest that hate communities sustain themselves in different ways, so efforts to moderate them should account for these differences. If a few voices drive the conversation, removing them could quiet the noise. But if new stories constantly appear from many contributors, harmful ideas may survive even after a few key accounts are taken down. Hate networks can persist even after social media platforms ban specific groups or accounts.
It is also important to understand how stories can make prejudice feel justified and emotionally compelling. Extremist stories may claim that a group is under attack, that outsiders are dangerous or subhuman, or that violence is the only way to stay safe….


