
Militant Accelerationism and the Online Radicalisation of Youth

Our Strategic Takeaways

 

1. Information sharing across platforms must improve. Governments and tech firms should share intelligence on extremist networks that migrate between platforms. Systems such as the Terrorist Content Analytics Platform (TCAP) show how collaboration and automated detection can limit the spread of extremist propaganda and identify emerging networks (a simple hash-matching sketch follows this list).

2. Platforms of every size need support. Many accelerationist groups now operate on smaller or low-moderation platforms. At Tech Against Terrorism, we offer these platforms tools, resources, and threat intelligence to help them recognise and remove violent extremist content.

3. Early warning capabilities should be strengthened. Monitoring online discourse can provide advance warning of radicalisation patterns and shifts in tactics. Our threat intelligence capabilities enable early detection of these trends, providing timely briefings that highlight emerging narratives, tactics, and cross-platform activity.

4. Prevention and intervention work is key. Prevention programmes remain vital for diverting children and youth from radicalisation. However, these efforts are often underfunded and limited by capacity rather than demand. Investment in local, digitally informed interventions should be prioritised.
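To illustrate the automated detection referenced in point 1, the sketch below shows one common building block: matching the hash of an uploaded file against a shared list of known terrorist content. This is a minimal, hypothetical Python example; the placeholder hash value, function names, and the use of plain SHA-256 are assumptions for illustration, not TCAP's actual API or data model, and production systems typically also use perceptual hashing to catch altered copies.

```python
import hashlib

# Hypothetical shared hash list (hex SHA-256 digests of known terrorist content).
# Placeholder value for illustration only; not TCAP's actual data model.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def hash_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_extremist_content(path: str) -> bool:
    """Flag an uploaded file for review if its hash matches the shared list."""
    return hash_file(path) in KNOWN_HASHES
```

In practice, a platform participating in such a scheme would keep the shared list updated from a central service and queue matches for human review rather than removing content automatically.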

 

Militant Accelerationism and the Online Radicalisation of Youth

By Peter Smith

The Problem

Across the West, extremist youth movements are being shaped and mobilised through online ecosystems built around militant accelerationism - a neo-Nazi ideology that glorifies violence to hasten societal collapse.

In one video, a masked man wearing neo-Nazi iconography runs up to a car parked in an empty field somewhere in the United States. He breaks a window with a sledgehammer, tosses an object into the back of the vehicle, and flees the frame. Seconds later, an explosion bursts from the remaining windows, engulfing the vehicle.

In another clip, a teenage boy living in France discharges a craft-made slam-fire shotgun into a wooded area. In the video, shared over a messaging application, the youth explains how he built the weapon from two pipes and some minor welding. Another video posted by the same account shows him firing a pistol into what appears to be a basement ceiling.

In a connected server, a Canadian teen repeatedly shares edited footage of the massacre in Christchurch, New Zealand, which killed 51 people. Similar posts include clips from Brazil, where in 2023 a child wearing a skull mask ran into his school with a knife, killing one teacher.

In chat servers, some of them with only a few dozen people, North Americans and Europeans trickle in through links shared over other channels, gaming servers, and a series of frequently banned social media accounts. Communicating in English, French, German, and memes, this community of mostly young people is united by a shared interest in neo-Nazi militant accelerationism. 

The logic is that using violence to exacerbate existing social tensions and divisions will hasten the supposedly inevitable collapse of society. After the ensuing conflict consumes liberal society, these young soldiers, or their descendants, will build the promised National Socialist-inspired ethnostate. Despite the improbability of this vision, the philosophy has repeatedly proven effective at pushing people into action, especially children and youth.

One group dedicated to pushing individuals to take action was the Terrorgram Collective, whose alleged members have been arrested in Denmark, the United States, Canada, and Slovakia. The group attempted to operationalise radicalisation by creating a culture that celebrates the perpetrators of mass shootings and urges followers to commit similar attacks for the white race. While the wave of international arrests of Terrorgram’s leadership has effectively halted its output, the group’s media strategy has had deadly consequences.

For example, in 2024, an 18-year-old male ran into a tea garden in Eskişehir, Türkiye. Armed with a knife and an axe, and dressed in combat gear and a skull mask, he live-streamed his attack, stabbing five people as he ran through the courtyard. He was apprehended by nearby bystanders as he attempted to flee the scene. Along with his manifesto, which was quickly machine-translated and read into an audiobook, the attacker shared 17 PDFs, three of which were Terrorgram publications.

Why This Matters

Arrests alone are often not enough to dismantle a network. Similar and inspired organisations have continued operating despite several of their members being charged and imprisoned, taking on new young members and experimenting with different communication platforms.

Adherents identify themselves as the descendants, aesthetically and ideologically, of the neo-Nazi militant accelerationist groups that came before them. While active, these groups came to be understood as nodes of what some researchers called the “Skull Mask network” - a term derived from the unofficial uniform many members wear: a half-face mask printed with a skeletal mouth.

The groups share a look, an ideology, and sometimes members, but exist as different points in an interconnected constellation of separate entities. Members of this network encourage acts of terror, attempting not only to organise into a collective but also to inspire others to take up arms against what they see as a corrupt and degenerate system. In this, they have achieved some success: members have been implicated in murders, robberies, and terror plots.

The groups and individuals that comprise the new generation are forging new networks and connections while remaining wedded to the past generation’s tactics. New and old propaganda are shared alike, and the uniform and ideology endure.

Typically young and compulsively online, many in the current iteration of the Skull Mask network are highly adaptive and quick to adopt new technology, whether using cryptocurrency to help anonymise payments or emerging social media platforms to connect with potential recruits.

The programs and knowledge needed to create video and image content have never been more accessible, especially to youth who have grown up with ubiquitous mobile technology. This technology allows both simple and complex propaganda to be produced and shared quickly. Congregating on social media platforms, participants share bomb-making tutorials, military manuals, and stylised media glorifying a coming cleansing racial war.

What Should Be Done

Governments and security apparatuses need to take a multilevel approach to addressing extremism inspired by far-right militant accelerationist philosophy. The networks that comprise this community have proven resilient and remain a persistent threat to domestic security.

Reducing the reach of these networks requires a holistic approach involving technology companies, educators, practitioners, policymakers, law enforcement, and researchers. In the United States in particular, efforts to study and mitigate radicalisation have been heavily defunded, and teams dedicated to domestic extremism have been de-prioritised within national law enforcement. In the private sector, major technology platforms have been reducing moderation while numerous smaller alternatives proliferate, offering low moderation as a feature. Reinvestment in these capabilities, alongside private-sector accountability, is essential to tackling online radicalisation.

Finally, prevention programmes remain an important means of diverting children and youth who are radicalising online before they cross into criminality. The individuals and organisations doing this work at a local level are typically understaffed and underfunded, limited by their capacity rather than by demand for their services. Investment in local, digitally informed interventions should be prioritised.

Conclusion 

Militant accelerationism is not only an ideological threat but a distinctly digital one. It thrives in online spaces where anonymity, accessibility, and algorithmic amplification enable young people to be radicalised at unprecedented speed. Disrupting these movements requires more than content removal; it demands coordinated, prevention-driven action. By identifying, analysing, and disrupting the online networks that sustain extremist ideologies, the international community can reduce the space in which violent groups recruit, operate, and inspire attacks.

 

Author Bio 


Peter Smith is an independent researcher and journalist with the Canadian Anti-Hate Network.
