Extremism in Gaming Spaces: Policy for Prevention and Moderation
This policy brief seeks to identify recommendations for governments, regulators and other international policymaking entities to design effective policies for preventing and countering violent extremism in gaming spaces, to enforce standards, and to support the development of capacity to moderate online harms.
Introduction and Context
Online experiences, including gaming, are becoming integral to daily life, and violent extremists and terrorists are increasingly exploiting these spaces. While current research recognises the role of extreme content — such as hate speech, conspiracy theories and disinformation — in reducing resilience to radicalisation to violence, there are still significant gaps in addressing these issues within gaming spaces. With gaming culture expanding rapidly and blending with popular entertainment culture, there is a pressing need to bridge these gaps in both research and policy.
So far, research on gaming and extremism has been limited to anecdotal evidence, a small number of pilot studies, and theoretical exploration, leaving much to understand about how gaming spaces can contribute to radicalisation to violence. Key aspects, such as the transnational nature of gaming spaces and the role of culturally informed identities, are still underexplored. Furthermore, the gendered dynamics of gaming and the impact of discrimination based on identity factors, such as gender, race and religion, have been largely overlooked in existing research.
This policy brief is part of a project funded by Public Safety Canada titled Examining Socialization with a Nexus to Radicalization Across Gaming (-Adjacent) Platforms Through a Gender Lens, which helps build an evidence base on the effects of toxicity, harassment and hate-based discrimination on individual users and group dynamics and their vulnerability to radicalisation to violence. The project uses a Gender-Based Analysis Plus (GBA+) approach to address these gaps, capturing the importance of gender and other intersecting identity factors.
Based on the project’s findings, this brief seeks to identify recommendations for governments, regulators and other international policymaking entities seeking to design effective preventing and countering violent extremism (P/CVE) policies1 for gaming spaces, to enforce standards through regulatory efforts, and to support the development of tech company and civil society capacity to moderate online harms. While tech companies also have a policy role to play to reduce harms on their platforms and through their products, this brief highlights recommendations for entities that are responsible for P/CVE as part of a broader public safety approach.
Split into two interconnected sections, the brief offers key recommendations for prevention and moderation, with a focus on the importance of multistakeholder collaboration.
Methodology
The project’s research design incorporated a mixed-methods approach, combining quantitative and qualitative data collection to examine socialisation processes in gaming and gaming-adjacent platforms across seven countries: Canada, the US, the UK, Germany, France, Australia and Indonesia. Data collection included a survey of over 2,200 gamers. Survey data was analysed to understand the prevalence of hate-based discrimination, extremist content and the role of identity formation and community building in gaming spaces.
These findings were supplemented by analysis of text- and image-based extremist and hate-based discriminatory content gathered from a variety of online gaming-adjacent forums, including Steam, Reddit, X, TikTok, Facebook and others. The project employed natural language processing2 and other content analysis techniques to analyse the vast amount of data collected, identifying trends in toxic behaviour and the spread of extremist ideologies.
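To illustrate the kind of technique involved, the sketch below applies a pre-trained transformer classifier to a handful of posts. It is a minimal illustration only, assuming the openly available Hugging Face transformers library and the unitary/toxic-bert toxicity model; the sample posts, label names and flagging threshold are hypothetical and do not reproduce the project’s actual pipeline.

```python
# Minimal sketch: screening collected posts for toxicity with a pre-trained
# transformer. Model choice, label names and threshold are illustrative
# assumptions, not the project's actual pipeline.
from transformers import pipeline

# Load an openly available model fine-tuned for toxicity detection
classifier = pipeline("text-classification", model="unitary/toxic-bert")

# Hypothetical posts; real data would come from platform collection
posts = [
    "gg everyone, that was a great match",
    "people like you should not be allowed in this game",
]

for post in posts:
    result = classifier(post)[0]  # e.g. {"label": "toxic", "score": 0.93}
    if result["label"] == "toxic" and result["score"] > 0.8:
        print(f"flagged ({result['score']:.2f}): {post}")
    else:
        print(f"ok: {post}")
```

In practice, outputs like these would feed aggregate trend analysis rather than per-post enforcement, with human review of anything flagged.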
Key Recommendations for P/CVE Policies and Interventions
Effectively countering radicalisation in gaming spaces requires a nuanced approach that considers the diverse ways in which gamers engage with these environments. Violent extremists exploit these platforms by capitalising on normalised toxicity, a sense of community and high levels of engagement to spread harmful ideologies. To address these challenges, P/CVE policies and interventions should focus on fostering inclusive gaming communities, promoting positive gamer identities and leveraging existing efforts and tools to build resilience to radicalisation.
Recommendation 1: Target P/CVE Interventions Based on Popular Game Genres
The project’s surveys of gamers identified several key trends in gaming preferences and engagement patterns. Across the case study countries, multiplayer online battle arena, first-person and third-person shooter, and role-playing games were found to be the most popular game genres.
Policymakers should therefore collaborate with game developers and platform providers to create voluntary, unobtrusive P/CVE interventions that target platforms and game genres based on their relevance and popularity with gamers, tailoring interventions to where players spend most of their time. Governments should engage civil society organisations (CSOs) and gaming communities to co-create interventions that are organic parts of the gaming environment, such as existing in-game charity or social campaigns. Interventions could also include in-game prompts that challenge extremist narratives, storylines that promote pro-social behaviours, and educational content integrated into games. Multistakeholder engagement with CSOs, developers and influencers could help tailor these interventions collaboratively to maximise impact for relevant audiences.
Recommendation 2: Improve Awareness of Reporting Tools
The project’s survey results revealed that one-third of gamers had been exposed to images, videos or symbols that promote extremism, and had witnessed people endorsing violence. Roughly one in four gamers surveyed encountered content suggesting they should join extremist groups. Toxicity is similarly commonplace, with nearly one-third of gamers viewing it as normalised in gaming spaces. To ensure that players are empowered to take action against these types of harmful content, raising awareness of existing reporting tools in gaming spaces is essential. Without widespread awareness, even the most effective reporting tools will remain underutilised, allowing harmful behaviour to persist.
Governments should partner with gaming platforms to embed frequent reminders and tutorials within games, educating players on how to report harmful content. These efforts could include integrating reporting tutorials into onboarding processes when players first start using a game or platform. In-game pop-ups or messages designed for maximum engagement can be employed to regularly remind players of the importance of reporting harmful behaviour, guiding them through the steps if necessary.
Recommendation 3: Offer More Educational Initiatives for Gamers
The observed normalisation of harassment and toxicity in gaming spaces can reinforce harmful stereotypes and foster identity-based discrimination and in-/out-group dynamics, which not only creates more hostile environments but also allows extremist ideas to proliferate. Education can be a powerful tool for challenging the normalisation of extremist content and toxic behaviour in gaming spaces, including organic forms of education such as peer-to-peer and community-led initiatives.
In collaboration with game developers and NGOs, governments should work to incorporate educational content directly into games and gaming forums. This could take the form of narrative elements within games that promote respect, inclusivity and understanding, even within the popular competitive titles that can foster more toxic community behaviour. For example, storytelling within popular games could include characters who challenge extremist ideologies, or plotlines that promote tolerance and critical thinking. Developers could also integrate prompts into gameplay that encourage players to reflect on the impact of their in-game behaviour, particularly in multiplayer settings. Beyond the games themselves, governments could support NGOs in creating educational campaigns that can be embedded in gaming forums, discussion channels and livestreams on gaming-adjacent platforms.
Recommendation 4: Design Gender-Responsive P/CVE Strategies
Female and LGBTQI+ gamers face unique challenges in gaming spaces, often becoming primary targets of gender-based harassment and toxic behaviour. Survey data from this project indicated that more than half of female gamers have experienced harassment based on gender identity, with many citing a lack of inclusivity and support in these environments. Addressing the gendered nature of harassment is vital for preventing long-term harm and ensuring that gaming spaces are safer and more inclusive for all users.
Governments should partner with CSOs and gaming platforms to create dedicated safe spaces for female and LGBTQI+ gamers, including relevant proactive support mechanisms and peer-led services. Such safe spaces, like Rainbow Arcade on Twitch, feature a zero-tolerance stance on discrimination, hate and harassment in all forms, thoroughly enforced by administrators, moderators and community managers. Educational content tailored to these platforms can focus on addressing gender-based violence, harassment and discrimination, highlighting the personal and collective impact of these issues. This could include launching gender-specific awareness campaigns that directly confront misogyny and anti-LGBTQI+ sentiments within gaming communities, for example by partnering with popular gaming influencers and developers to spread positive messages.
Recommendation 5: Utilise Online Engagement for Positive Influence
Gamers spend considerable amounts of time online, with project survey results and other sources suggesting an average of 8 to 20 hours per week for most gamers. This prolonged engagement offers numerous touchpoints to promote positive community interactions and resilience to radicalisation. Rather than viewing gaming purely as a risk, the time spent online can also present an opportunity to influence gamers positively.
Policymakers should provide funding for NGOs, CSOs and grassroots community groups to run regular outreach initiatives that engage gamers. These could take the form of in-game events or tournaments that promote pro-social behaviour, inclusivity and positive messaging. Government support can come in the form of grants or funding for initiatives or awareness-raising campaigns. For example, P/CVE-specific campaigns could be integrated into popular communication channels such as Facebook Messenger, Instagram and in-game chats, capitalising on their significant user bases. By funding and supporting community-led initiatives, policymakers can ensure that gamers themselves play an active role in promoting a safe and inclusive environment.
These types of initiatives can also help to build trust between diverse gamer groups, which is crucial for preventing discrimination and radicalisation. Initiatives bringing together gamers from different backgrounds, such as through mixed-identity gaming tournaments, collaborative events and cross-cultural challenges, could be organised by CSOs and supported by governments through grants or public–private partnerships. Platforms and multiplayer games that are commonly used by diverse groups can serve as effective venues for these interventions. Governments can also partner with gaming companies to promote these events, providing in-game rewards to participants or creating exclusive content tied to inclusive campaigns, ensuring a broad reach without direct government involvement.
Recommendation 6: Leverage Existing Influence and Relationships
Influencers and content creators have established relationships with diverse audiences within gaming communities, and their voices carry significant weight. These individuals can act as powerful allies in spreading P/CVE messages and promoting resilience to radicalisation.
Governments and CSOs should collaborate with influencers and/or content creators by providing them with resources, training and incentives to promote inclusivity and resilience against extremist content. For example, policymakers can partner with gaming platforms to create funding opportunities for influencers who commit to promoting anti-extremist messaging and fostering positive online interactions. These initiatives could include partnerships with influencers who create ‘Let’s Play’ content or livestreams promoting tolerance, with governments offering technical support, training or resource materials on extremist content, while minimising direct involvement to avoid a perception of inauthentic messaging.
Partnerships should also prioritise the safety and wellbeing of content creators, ensuring any initiatives with government or CSO support do not cause negative backlash. Influencers should be encouraged to create content that resonates with their audiences and is culturally relevant, ensuring that P/CVE messaging is authentic and credible.
Recommendation 7: Promote Allyship Programmes
When it comes to allyship within gaming spaces and communities, the survey data3 revealed that most gamers have, at least once within a year, stood up for someone else being harassed online within their own community, with slightly fewer having done so for someone from a different community. Nearly two-fifths of gamers surveyed, however, indicated that they have never stood up for someone experiencing harassment, including within their own community.
Given these trends, governments should consider funding community-driven allyship programmes that encourage gamers to support one another in maintaining respectful, inclusive environments. These programmes could be co-created with game developers, gaming platforms and CSOs to ensure they resonate with the gaming community. Empowering gamers to be active bystanders rather than passive participants helps prevent toxic behaviour from becoming further normalised. By promoting allyship within gaming communities, governments and platforms can reduce harmful content and increase self-regulation, fostering a sense of collective responsibility.
Recommendation 8: Expand Support Systems for Targeted Gamers
Nearly three-quarters of gamers surveyed reported witnessing harassment, and nearly half also experienced harassment themselves, with younger people (18–28 year olds) reporting the highest rates. Gamers who experience harassment, hate speech or extremist content require accessible and comprehensive support resources, ensuring that they feel safe and supported within gaming spaces.
Governments should support the creation of robust support systems for gamers, working in partnership with NGOs, CSOs, platform providers, influencers and content creators. These systems could include counselling services, mental health support and resources for reporting harassment or extremist content. Such resources could be promoted through popular gaming and gaming-adjacent platforms, ensuring that gamers know where to turn for support and encouraging victims to seek help early.
Key Recommendations for Regulation and Content Moderation
While P/CVE and related interventions are critical for preventing radicalisation and building resilience within gaming spaces, they are only part of a comprehensive approach to combating extremist and other harmful content. The inherently global and fast-evolving nature of online platforms means that regulation and content moderation are equally necessary to ensure that digital ecosystems are less conducive to the spread of extremist and harmful content.
Recommendation 9: Develop Regulatory Frameworks to Address Online Harms
Evidence of exposure and harm in gaming spaces indicates that it is vital to build frameworks that require transparency and accountability from digital platforms in efforts to prevent and combat online harms. Governments should pursue regulatory frameworks for online harms mitigation, with an emphasis on human rights considerations and the protection of children and adolescents.
The EU’s Digital Services Act and the UK’s Online Safety Act 2023 provide examples of regulatory efforts that may offer useful lessons for other countries. Regulators are building an understanding of the range of harms that can be regulated and how to achieve this, as well as which entities need to be engaged and how to design regulation that accounts for tech companies’ differing levels of resources. However, it is important to note that regulation will remain a challenge if there is limited ability to apply the same level of scrutiny and legislation to small or fringe platforms as to mid- to large-size platforms.
Recommendation 10: Strengthen Platform-Level Moderation Efforts
Extremist content — including recruitment attempts and hate-based narratives — often proliferates in spaces with insufficient moderation, particularly in fast-moving environments such as gaming chats and livestreams. Stronger moderation is essential for preventing the escalation of toxic behaviours into radicalisation, and ensuring that platforms remain safe spaces for all users.
Therefore, policymakers should collaborate with gaming and gaming-adjacent platforms to strengthen content moderation and anti-harassment policies, including by providing clear definitions and precedent on which tech companies can base moderation decisions. This collaboration could involve developing and implementing AI-driven moderation tools that enhance moderation capabilities across communication modalities without infringing on user privacy. These tools could leverage machine learning models to detect hate speech, harassment and extremist narratives in real time, thereby helping platforms to handle the volume of content produced by gamers while maintaining user engagement. Additionally, governments should encourage platforms to be transparent about how their moderation tools function, following the example set by the EU’s Digital Services Act Transparency Database.
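As a sketch of how confidence-based triage might work in fast-moving chat, the example below routes each message to an automated action or a human review queue depending on a classifier score. The scoring function is a stand-in for any trained hate-speech or extremism classifier, and the thresholds and actions are hypothetical illustrations rather than any platform’s actual policy.

```python
# Minimal sketch of threshold-based moderation triage for live game chat.
# The scorer is a toy stand-in for a trained classifier; thresholds and
# actions are illustrative assumptions, not any platform's real policy.
from dataclasses import dataclass

@dataclass
class ChatMessage:
    user: str
    text: str

def harm_score(message: ChatMessage) -> float:
    """Hypothetical stub returning a 0-1 probability that the message
    contains hate speech or extremist content; a real system would call
    a trained model here."""
    blocklist = ("we hate", "join us or else")  # toy heuristic
    return 0.9 if any(term in message.text.lower() for term in blocklist) else 0.1

def moderate(message: ChatMessage) -> str:
    """Route a message based on classifier confidence."""
    score = harm_score(message)
    if score >= 0.85:
        return "hide"    # high confidence: remove and notify the sender
    if score >= 0.5:
        return "review"  # uncertain: queue for a human moderator
    return "allow"       # low risk: deliver normally

if __name__ == "__main__":
    for msg in [ChatMessage("p1", "nice shot!"),
                ChatMessage("p2", "we hate your kind, join us or else")]:
        print(msg.user, "->", moderate(msg))
```

Keeping a human-review tier between automatic removal and normal delivery is one way to balance moderation speed against the privacy and over-removal concerns noted above.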
Recommendation 11: Enhance Moderator Training to Address Toxicity and Identity-Based Exclusion
Moderators are often the first line of defence in maintaining safe and inclusive online spaces. However, many moderators lack the training and resources to effectively address extremist content or manage identity-based exclusion and harassment. Effective moderation can mitigate toxicity and prevent the escalation of harmful behaviours that can lead to radicalisation. Ensuring moderators are equipped to address these issues is essential for creating resilient gaming communities, as well as for reinforcing moderators’ own effectiveness. By offering enhanced training and resources, companies can demonstrate commitment to regulatory requirements and to their platforms’ trust and safety policies.
Governments should fund and support training programmes for community moderators across popular platforms, providing tools and resources to help them manage toxic behaviour and identity-based exclusion. This can be through public–private partnerships, where gaming platforms collaborate with CSOs to provide training focused on managing conflict, identifying extremist content and fostering inclusive interactions. Policymakers can also work with industry leaders to ensure that moderation tools include features that promote positive community engagement, such as flagging harmful content or providing users with immediate access to support resources.
Recommendation 12: Implement Gender-Sensitive Training for Moderators
Moderators are often ill-equipped to recognise, understand and address the specific challenges that female and LGBTQI+ gamers face, especially considering the historically male-dominated nature of gaming spaces. Without specialised training, moderators may overlook or fail to respond adequately to gender-based harassment, allowing toxic behaviours to go unchecked. Ensuring that moderators have the tools and knowledge to manage gender-based issues is essential for reducing harmful interactions and creating safer online spaces.
Governments should partner with technology companies and platform providers to develop and implement gender-sensitive moderation protocols that enable quicker and more effective responses to gender-based harassment. This could also include establishing training programmes for moderators that focus on identifying and addressing misogynistic behaviours and harassment targeting female and LGBTQI+ gamers. Governments should also consider working with international bodies and NGOs to standardise these training programmes across platforms and languages. This would involve creating guidelines for identifying harassment that goes beyond explicit hate speech and includes more subtle forms of exclusion or abuse based on gender identity.
Recommendation 13: Support the Creation of Inclusive Game-Design Guidelines
The representation of gender and other intersecting identities in video games has long been skewed toward toxic masculinity, often excluding, distorting or marginalising female, LGBTQI+ and other diverse characters. Games that rely on hyper-masculine narratives can contribute to the culture of misogyny that persists within many gaming spaces, making it harder to foster inclusive environments. Encouraging more diverse and representative game design can challenge these norms and help to promote inclusivity.
Governments can incentivise game developers, for example by leveraging key allies or outlining business benefits, to adopt inclusive design guidelines that reflect diverse experiences in character representation, storylines and gameplay mechanics. Governments can also work with gaming industry leaders to develop and promote best practices for inclusive game design, potentially contributing to an industry-wide shift toward more diverse representation. Collaborations between governments, game developers and CSOs could result in the creation of toolkits or frameworks that guide developers in creating games that are inclusive of all genders.
Recommendation 14: Strengthen Reporting Systems
Effective reporting mechanisms are critical for empowering gamers to identify and counter harmful content, including harassment and extremism. However, the survey results revealed that although most gamers know how to report harmful content on at least one platform, only about half have reported harmful content on one platform, with a minority doing so on all platforms. This is often due to a lack of follow-up and clarity for those who report content as to whether any action is being taken. By improving reporting mechanisms and feedback processes, platforms can increase user engagement in maintaining safe and inclusive environments.
Governments should collaborate with platforms to make reporting processes more intuitive, with fewer steps, clearer instructions and visible options for users to track the status of their reports. This could be complemented by co-branded educational campaigns between governments and platforms to raise awareness and ensure that users understand how to report harmful content.
Conclusion: Global Multistakeholder Engagement is the Way Forward
The fight against radicalisation to violence in gaming spaces is not limited to a single country; it is a global challenge that requires a global approach. The interconnected nature of online gaming platforms transcends national borders, meaning that P/CVE efforts must be internationally coordinated.
Multistakeholder collaboration is essential in this environment, particularly in the online space, where public, private and civil society actors must coordinate efforts to tackle the unique challenges posed by violent extremist actors. Effective P/CVE interventions in gaming spaces require robust partnerships and co-creation of approaches with technology and gaming companies, academia, civil society and gamers themselves. These partnerships are also crucial to developing the evidence base, including data collection and analysis, and translating research findings into actionable strategies.
Maintaining partnerships and established frameworks helps to mainstream mitigation efforts across different domains and sectors, including tech companies’ Trust and Safety teams, community moderation efforts, government policy development, and practitioner-level prevention and intervention strategies.
Furthermore, prevention and moderation policy must continue to emphasise a GBA+ approach, providing the funding, time and impetus for P/CVE practitioners and moderators to mainstream gender and to consider more widely where inequalities based on various identity factors can compound discrimination and harm. This also means that resources need to be committed at government level to ensure the same mainstreaming activities take place across policy, partnerships, regulatory and other related efforts.
Policymakers across the world can take stronger leadership roles in facilitating international dialogues and collaborative initiatives, bringing together governments, tech companies, CSOs and gamers themselves to develop and implement cohesive, cross-border strategies, as well as to encourage a balanced view that recognises both the potential harms and the benefits of gaming spaces.
The project ‘Examining Socialization with a Nexus to Radicalization Across Gaming (-Adjacent) Platforms Through a Gender Lens’ is funded by Public Safety Canada, led by RUSI and implemented by a consortium of members of the Extremism and Gaming Research Network.
The production of this resource was made possible through the work of the entire project team, including the listed authors, Galen Lamphere-Englund, Love Frankie, Alex Newhouse, Rachel Kowert, Alexandra Phalen, Ashton Kingdon, Linda Schlegel, Moonshot and Gonzalo Saiz.
Additionally, the authors would like to thank the individuals who dedicated their time and expertise to reviewing and editing this resource, including internal and external peer reviewers and the RUSI Publications team.
WRITTEN BY
Claudia Wallner
Research Fellow
Dr Jessica White
Acting Director of Terrorism and Conflict Studies
Terrorism and Conflict
Petra Regeni
Research Analyst and Project Officer
RUSI Europe
Footnotes
1. ‘Preventing and countering violent extremism (P/CVE)’ is the chosen language for this policy brief. However, these types of interventions can also be referred to as ‘countering radicalisation to violence’ (CRV), as used by Public Safety Canada, ‘targeted violence prevention’, and others.
2. Natural language processing techniques were a key analytical component of this project, particularly in examining the vast amount of text data collected from gaming platforms and gaming-adjacent forums. For more information, see Jessica White et al., ‘Radicalisation through Gaming: The Role of Gendered Social Identity’, Whitehall Report 2-24, pp. 16–18, <https://rusi.org/explore-our-research/publications/whitehall-reports/radicalisation-through-gaming-role-gendered-social-identity>, accessed 27 January 2025.
3. Love Frankie, ‘Phase II: Gamer Violent Extremism (VE) Exposure and Resilience. Internal Survey Report’, May 2024.