Clinical relevance: Two new studies reveal that online hate, misinformation, and incel extremism often stem from underlying mental health struggles.

  • Hate speech communities shared linguistic patterns with forums on antisocial, borderline, and narcissistic personality disorders.
  • Incel extremism followed two psychological paths: one rooted in trauma, the other in toxic traits and ideology.
  • The researchers argue that effective interventions must address mental health, not just online behavior.

Two independent studies expose the psychological foundations of harmful online behavior. Collectively, the research reveals how closely intertwined hate speech, misinformation, and incel extremism are with mental health struggles and certain personality traits.

Taken together, the new data offers mental health professionals and policymakers a roadmap for navigating the intersection of online extremism and psychological distress.

Tracking Mental Health Through Online Speech

In a sweeping machine learning study appearing in PLOS Digital Health, researchers from Texas A&M and the University of Alabama at Birmingham used OpenAI’s GPT-3 to analyze thousands of Reddit posts from more than 50 communities. These included groups centered on psychiatric disorders, control communities, and forums flagged for hate speech or misinformation.

The team generated “semantic embeddings” of each post – high-dimensional vectors that distilled the meaning behind the text – and applied topological data analysis (TDA) to better understand how community members interacted based on language patterns.
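The embedding-comparison idea can be sketched in a few lines. This is a toy illustration, not the study’s actual pipeline: the random vectors stand in for real language-model embeddings, and the centroid cosine-similarity step is a deliberate simplification of the topological data analysis the researchers applied.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for high-dimensional semantic embeddings of posts,
# grouped by community. Real embeddings would come from a language
# model; the means below are chosen so two communities "sound alike."
rng = np.random.default_rng(0)
communities = {
    "hate_speech":   rng.normal(loc=1.0,  size=(100, 64)),
    "aspd_forum":    rng.normal(loc=1.0,  size=(100, 64)),
    "anxiety_forum": rng.normal(loc=-1.0, size=(100, 64)),
}

# Compare communities by the cosine similarity of their embedding centroids.
centroids = {name: posts.mean(axis=0) for name, posts in communities.items()}
sim_hate_aspd = cosine_similarity(centroids["hate_speech"], centroids["aspd_forum"])
sim_hate_anx = cosine_similarity(centroids["hate_speech"], centroids["anxiety_forum"])
print(f"hate vs ASPD forum:    {sim_hate_aspd:.2f}")
print(f"hate vs anxiety forum: {sim_hate_anx:.2f}")
```

In this toy setup, the hate-speech centroid lands far closer to the ASPD forum than to the anxiety forum, mirroring the kind of linguistic-similarity comparison the study reports.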

The goal? To figure out whether the linguistic fingerprints of hate and misinformation posts resemble those found in mental health communities. And, if so, which ones.

The results were startling. Posts from hate speech communities shared the most linguistic similarity with groups focused on:

  • Antisocial Personality Disorder (ASPD).
  • Borderline Personality Disorder (BPD).
  • Narcissistic Personality Disorder (NPD).
  • Schizoid Personality Disorder.
  • Complex PTSD.

Misinformation posts, on the other hand, skewed closer to general or anxiety-related forums, suggesting a blend of confusion and fear rather than hostility or entitlement.

In short, the hate speech the researchers encountered on Reddit doesn’t just echo political grievances or ideological positions. More often, it reflects the psychological profiles found in communities grappling with serious mental health conditions.

Inside the Incel Mindset

In a separate study published in Archives of Sexual Behavior, researchers from Swansea University and the University of Texas at Austin explored the roots of incel extremism.

The researchers polled 561 self-identified incels from the United Kingdom and the United States and tested what predicts misogynistic, violent, or otherwise harmful beliefs.

The study’s authors proposed a “dual pathways hypothesis”: two distinct paths to incel harm. One stems from experiential vulnerability, including traits such as autism, low self-perceived mate value, and a history of bullying or abuse. The other arises from dispositional traits, such as Dark Triad characteristics or an extreme right-wing political orientation.

Both paths, the authors insist, ultimately lead to three core risk factors identified using the “3N” model of radicalization:

  1. Psychological Needs (like depression or loneliness),
  2. Ideological Narratives (such as black-pill fatalism),
  3. Online Networks that reinforce and amplify these beliefs.

When these elements interact, the researchers found, the risk of hostile sexism, displaced aggression, and even justification of violence skyrockets.

Researchers also found that poor mental health and ideological commitment predicted harmful attitudes twice as strongly as time spent networking online. While online forums remain a core component of radicalization, they tend to amplify pre-existing vulnerabilities rather than establishing new ones.

A Shared Landscape of Alienation

Together, these studies illustrate a broader (and darker) reality: Online extremism – no matter what form it takes – usually sprouts in soil fertilized by psychological struggle.

Many participants in both studies displayed elevated levels of anxiety, depression, and suicidal ideation, as well as signs of neurodivergence.

In the incel study, nearly one in three participants met the clinical threshold on an autism screening measure, and more than 85% reported a history of bullying. Despite media portrayals of incels as ideologically hardened misogynists, the data reveals a group that is more often mentally unwell, socially isolated, and ideologically confused than committed.

These subtle distinctions matter. As the authors of both papers stress, not everyone who struggles with mental health or belongs to a marginalized group expresses hate or engages in harmful online behavior. But for those who do, mental distress might dictate more than how they speak. It might determine which ideas they respond to.

Toward More Compassionate Interventions

Holding back the tidal wave of online hate isn’t just a job for content moderators or law enforcement. It’s also a public health challenge.

Interventions that focus exclusively on deplatforming or censorship might overlook the stronger undercurrents that pull some users toward extreme beliefs in the first place.

Instead, both studies argue for integrated strategies that include mental health support, ideology-disrupting education, and safer online environments that don’t just police hate, but offer healthier options to the most vulnerable.

Further Reading

How Online Browsing Shapes – and Reflects – Mental Health

Hidden Use of ChatGPT in Online Mental Health Counseling Raises Ethical Concerns

How Social Media Fuels Self-Delusion