By Allysa Czerwinsky, PhD candidate at the University of Manchester

As a researcher who spends much of her time in the online sphere, I’ve witnessed a stark rise in hateful and polarising content across the platforms I frequent for both academic fieldwork and personal entertainment. Alongside racist, anti-immigrant, and anti-vax sentiments within increasingly polarised digital spaces, misogynist hostility and male supremacist ideas have flourished within mainstream social media sites. Aggressive communication in the online sphere is far from new, but the ways in which specific terms, ideas, and language are used have direct consequences for the people they target, creating a differential experience of participation in online space alongside tangible impacts in the offline sphere. In the first of a series of blog posts building on the chapters within our forthcoming Routledge Handbook on Conflict and Peace Communication, I explore the shifts in misogynist hostility in the online arena, using recent cases to illustrate the general nastiness of anti-women speech across platforms.

Feminist scholars and activists have long traced the contours of misogynist hostility in the online sphere, highlighting its pervasiveness across mainstream social media platforms like Twitter/X, Facebook, Reddit, YouTube, and TikTok. Gender-based slurs like ‘bitch’, ‘slut’, and ‘whore’, coupled with gender essentialist narratives positioning women as inferior and graphic threats of physical and sexual violence, are an accepted reality for women and feminine-presenting people who participate in such online spaces. Misogynist hostility also incorporates concerted efforts, taking the form of collaborative attacks that are enabled and shaped by the affordances of the online environment. Perhaps the most widely known example of this was Gamergate, a harassment campaign against women game developers and media critics coordinated across platforms in 2014. A decade later, a similar coordinated harassment campaign has its sights set on a small Montreal-based game consultancy company, Sweet Baby Inc., with women and queer employees bearing the brunt of harassment across Steam, KiwiFarms, 4Chan’s /pol/ board, Twitter, and TikTok. Though Sweet Baby Inc.’s role is to ensure consistency in character dialogues and plot points rather than influence game development directly, disgruntled gamers have held the company responsible for including “woke” content in popular video games, such as the addition of Black women and queer characters in prominent story roles and as options for character selection.

Male Supremacism and the Manosphere

Networked misogyny campaigns have been connected to the Internet manosphere, a loose collection of masculinist communities across online platforms united by shared concerns around masculinity and men’s rights (Czerwinsky, 2024; Ging, 2019). In recent years, mainstream and academic discussions of misogynist hostility throughout the manosphere have largely focused on misogynist incels, a vocal subset of heterosexual, cisgender men who self-identify as involuntary celibates. However, misogynist hostility underpins a variety of manosphere communities and groups, each of which adopts its own alternative belief systems, catalogues of ‘scientific’ evidence, and glossaries of misogynistic community-specific terms to support its worldview. This misogynist worldbuilding adds a sense of structure and legitimacy that extends beyond one-off uses of slurs and graphic threats: male supremacism is embedded in the ethos of these communities, providing users with an ideological basis for subjugation and dehumanisation, alongside the evidence and language needed to accomplish this in the online sphere. Across communities, glossaries of key terms and dedicated wiki pages outline how pseudoscientific evidence supports guiding ideologies. This information is presented in a way that mirrors traditional forms of knowledge sharing in the online sphere, a marked departure from common perceptions of isolated abuse by online trolls.

While manosphere communities have existed alongside the familiar anti-women speech seen within platforms like Facebook, Twitter, and YouTube for over a decade, the structured male supremacism that was once confined to fringe and alt-tech platforms like Kiwi Farms, 4/8Chan, Reddit, lesser-known blogs and vlogs, and independent community forums, has begun to seep into mainstream discussions. The rhetoric espoused by misogyny influencers like Andrew Tate in his Tate Speech Podcast and in videos shared on YouTube and TikTok shows striking similarities with the kind of aggressive speech used throughout the wider manosphere, contributing to the mainstreaming of male supremacist ideas beyond fringe spaces. 

Intersectional Hostility

At its core, misogynist hostility works to centre expectations of femininity that are implicitly coded as white, heterosexual, and cisgender, allowing aggressive communication to collude with racism, settler colonialism, anti-LGBTQ+ rhetoric, transphobia, ableism, and classism (Czerwinsky, 2024; Crenshaw, 2017; Collins, 2019).  Scholars and activists of colour have long documented that Black, brown, and Indigenous women and feminine-presenting people are subject to staggering amounts of abuse when compared to their white counterparts, particularly through specific language that combines racism and misogyny. Meghan Markle’s entry into and subsequent exit from the British Royal Family left her the target of racist and misogynist harassment across Tweets and tabloids, alongside a monetized hate campaign by multiple YouTube channels. These intersections work to create a differential experience of hostility for women of colour in the online sphere, who are positioned askance of acceptable (read: white) displays of femininity. 

Aggressive communication also intersects with anti-LGBTQ+ rhetoric, with both queer and cis women targeted by abuse that adopts queerphobic and femmephobic language. Most notably, the current rhetoric espoused by gender-critical ‘activists’ blends elements of transphobia and misogynist hostility, working to single out and subjugate trans women who fail to conform to the norms of their sex at birth and to acceptable displays of cisgender femininity. Transmisogynist speech is often legitimised through concerns about the safety of cis women and children, an argument that positions trans women askance of acceptable notions of womanhood by conflating them with predatory men. For instance, former comedy writer Graham Linehan, who now rails against trans women on Substack, stated that fighting for trans rights and acceptance was a cover for “fetishists, conmen, and simply abusive misogynists” to invade women’s spaces. Similarly, once-beloved children’s author JK Rowling has shifted to spreading transmisogynist beliefs through Twitter, personal essays, and podcast appearances, using her platforms to promote the narrative that trans women are predatory men encroaching on women’s spaces. Transmisogyny also intersects with racist hostility, leaving Black and Latinx trans women subject to increasing vitriol and violence both online and off.

While mainstream platforms have made strides in responding to misogynist hostility through stricter content policies and additional moderation efforts, misogynist messaging and male supremacist content still permeate these spaces. Moreover, the growing popularity of alt-tech platforms like Gab, Rumble, and Gettr, heralded as ‘free-speech havens’ by their users, offers a space for misogynist hostility and male supremacism to thrive. Increasing amounts of misogynist hostility across platforms have important consequences for how we choose to participate in and interact with others in online space. For instance, a recent survey of UK adults found that a fear of aggression and misogynist hostility in the online sphere left women significantly less likely to share their opinions, personal photos, or content in general.

In our increasingly technosocial society, the ways in which we communicate about others in the online sphere are informed by – and directly inform – shifts in political, cultural, and social conversations offline, creating tangible impacts for those who are disproportionately targeted by misogynist hostility and male supremacism. 

Check out Allysa’s recent talk about her chapter as part of the HCPB event series here