By Emily Donovan

Young men and women are drifting further and further apart politically, and it’s no surprise that social media plays a huge role in this. With the rise of self-proclaimed misogynists like Andrew Tate and incel culture infiltrating our everyday ideas about gender and sex, social media has slowly become a cesspool of harmful stereotypes, repeated over and over again. But not everyone is seeing the same thing. While some users might see content that uplifts feminism or positive masculinity, others, particularly teenage boys, are guided down a path where misogyny is rampant. These aren’t just random posts or isolated creators. This is algorithmic conditioning. It’s addictive, and it’s altering how an entire generation of boys understands gender.
The argument I’ll be making in this article is that the real danger isn’t just that boys are watching misogynistic content; it’s that platforms like TikTok are actively cultivating these beliefs through algorithmic design. What was once a fringe ideology is now mainstream, thanks to the silent power of recommendation algorithms.
We’ve left our beliefs in the hands of algorithms, but algorithms aren’t interested in the state of our world or the consequences. They don’t care about fairness or equality; they care about engagement. And for teenage boys, this can be very dangerous. Within minutes of scrolling, they are swept into a spiral of misogynistic content, most of it dressed up as self-help or “male motivation”. It’s not an accident. It’s by design. And it’s widening the gender gap, one FYP at a time.
Misogyny is being sold as empowerment. And the scariest part? It’s gone viral.
The Algorithmic Funnel
Imagine you are a 14-year-old boy, being fed video after video that says things like “Women belong in the home and are a man’s property” or “The idea that women were oppressed throughout history is an appalling theory.” These are real statements made by online influencers like Andrew Tate and Jordan Peterson, and they are not difficult to find.
We can’t entirely blame these boys for absorbing these messages. It’s a narrative that is both algorithmically and intentionally designed to make them feel like kings. Why would they want to turn away from that?

TikTok & YouTube Shorts: A Case Study
The rise of algorithm-driven social media has taken an ancient problem and given it a new, hyper-viral edge. TikTok, perhaps the largest platform in this space for young users, is central to the story. We’ve entered a moment where ideas once confined to the darkest corners of internet forums are now being served to kids on the For You Page, an experience that’s not random at all. It’s a highly engineered system designed to maximise watch time and emotional reaction.
A 2024 study by Dublin City University’s Anti-Bullying Centre, Recommending Toxicity, tracked the TikTok and YouTube Shorts content recommended to accounts modelled after teenage boys. It found that TikTok began recommending misogynistic, alpha-male content within 2.6 minutes, and that within 30 minutes these accounts were being flooded with manosphere narratives (Baker, Ging & Andreasen, 2024).
This is not just edgy content. It’s the backbone of a digital ecosystem reinforcing one-sided, often dangerous ideologies. What we’re seeing here is not coincidental and it’s certainly not neutral. It’s the result of algorithmic governance, a new kind of power that shapes how boys see women and how they see themselves.

Algorithms Don’t Reflect Us—They Build Us
Algorithms don’t just show us what’s popular; they decide what becomes popular. As Just and Latzer (2017) explain, algorithms perform “reality construction”: they select, filter, and amplify information, shaping what users perceive as true, important, or normal. When teenage boys see women consistently portrayed as deceitful, inferior, or superficial, that portrayal becomes their new baseline of belief.
This is the quiet poison of the algorithm: it doesn’t shout; it whispers, over and over, until it’s the only voice in the room.
Virginia Eubanks (2018) reinforces this point by explaining that digital systems don’t just reflect inequality; they automate it. In Automating Inequality, she shows how high-tech tools reinforce existing social hierarchies and deepen marginalisation. In this case, platforms are not simply distributing content; they are reinforcing gendered power structures by embedding misogynistic values into the very logic of content delivery. For vulnerable users like teenage boys, this means absorbing narratives that were once fringe as if they were fact.
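To make that feedback loop concrete, here is a deliberately toy Python simulation. Everything in it is invented for illustration, the “intensity” score, the user model, the exploration rate; it is nothing like TikTok’s actual system. What it does show is how an objective that only rewards watch time drifts, on its own, toward the most provocative content available.

```python
import random

# A toy engagement-first recommender. "Intensity" is an invented
# stand-in for how provocative an item is, and the user model simply
# assumes more provocative content holds attention longer. None of
# this is a real platform's code; it only demonstrates the loop.

random.seed(42)

# 101 candidate items, with intensity from 0.0 (mild) to 1.0 (extreme).
items = [i / 100 for i in range(101)]

def watch_time(intensity):
    """Toy user model: provocative content is watched longer, plus noise."""
    return intensity + random.gauss(0, 0.1)

total_watch = {i: 0.0 for i in items}   # engagement observed per item
impressions = {i: 1 for i in items}     # start at 1 to avoid divide-by-zero

for _ in range(5000):
    if random.random() < 0.1:
        item = random.choice(items)     # explore occasionally
    else:                               # otherwise exploit what performs best
        item = max(items, key=lambda i: total_watch[i] / impressions[i])
    total_watch[item] += watch_time(item)
    impressions[item] += 1

best = max(items, key=lambda i: total_watch[i] / impressions[i])
print(f"The feed settles on intensity {best:.2f} out of 1.0")
```

Run it and the recommender settles near the top of the intensity scale. No one in the loop chose to promote extreme content; the objective did it for them.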
Profitable Vulnerability
Adolescence is a formative period—boys at 14 or 15 are developing identities, seeking validation, and questioning authority. TikTok steps in as a kind of digital mentor, but instead of providing healthy models, it pushes narratives that portray men as victims and women as threats.
Kate Crawford (2021) reminds us that AI systems are “embedded in histories and structures of power.” Algorithms are not neutral; they are coded with bias and built to exploit vulnerability. Platforms take advantage of boys’ insecurities and maximise their screen time with content that tells them they’re owed dominance.

From the Algorithm to the Manosphere
This content feeds into the broader “manosphere”, a toxic blend of incels, pick-up artists, and men’s rights activists. These groups share one core belief: masculinity is under attack, and women are to blame. While social media didn’t create the manosphere, it made it viral.
Frank Pasquale (2015) describes how “black box” algorithms operate without transparency or accountability. They don’t promote based on truth or ethics; they push what performs best. And what performs? Extremism.
The Widening Gender Divide
A 2024 article in the Financial Times reports that in the U.S., women aged 18 to 30 are now far more liberal than men of the same age (Burn-Murdoch, 2024). Similar gender divides are appearing globally. Social media platforms and their algorithmic ecosystems are helping drive this wedge deeper.
TikTok’s For You Page isn’t just entertainment; it’s ideological programming. The more boys consume content that glorifies male entitlement, the more they’re nudged toward a worldview that pits them against women.
Even educators have noticed this. A recent survey of teachers in South Australia revealed a disturbing rise in sexist and aggressive behaviour by male students, including the use of misogynistic memes, sexualised gestures, and intimidation of female staff (Schulz, 2024).
TikTok’s Defence (and Its Failures)
TikTok claims it bans hateful ideologies, but enforcement is inconsistent. Harmful content often hides behind humor, emojis, or the guise of “self-improvement.” Even banned creators, like Andrew Tate, resurface through reposts and fan accounts.
Terry Flew (2021) argues that platform regulation is caught between protecting free speech and addressing real harm. TikTok’s business model depends on high engagement, and few things work like controversy.
Harvard Business Review’s article “AI Regulation is Coming” (Candelon et al., 2021) shows that the EU is making real strides with the Digital Services Act, which demands transparency and risk mitigation. The U.S. remains gridlocked, stalled by Big Tech lobbying and free speech debates.
Pasquale (2015) emphasises the need for algorithmic accountability. We need audits, disclosures, and user rights to challenge biased content systems.

So What Now?
This all sounds pretty bad, but it doesn’t have to stay this way. There’s no one magic fix, but there are things we can start doing right now to slow this down and create better online spaces for boys (and everyone else too).
1. Teach boys how algorithms actually work.
Most teens don’t realise that their feed isn’t random; the videos they see aren’t showing up by accident. We need to make digital literacy part of everyday learning, not just how to avoid scams or use Google, but how these platforms are built to keep you watching, even when the content is toxic. Boys should know when they’re being manipulated.
2. Make platforms explain themselves.
Right now, platforms like TikTok don’t have to tell us anything about how their recommendation systems work. That needs to change. If we want to hold them accountable, we need to know why certain content is being pushed, especially to kids. The EU is already making moves on this. It’s time countries like Australia and the US caught up.
3. Change what gets rewarded.
The reason this content keeps showing up is simple: it performs well. The algorithm is built to reward what people click on, not what’s helpful or healthy. But platforms could change that: they could prioritise content that builds understanding instead of outrage (a toy sketch of what that reweighting might look like follows this list). Right now, they just don’t want to. That’s where regulation comes in.
4. Boost better role models.
There are actually a lot of creators out there trying to model a more positive version of masculinity, ones who talk about respect, kindness, confidence, and real growth. The problem is, they’re getting buried under the noise. If platforms gave them more visibility, we’d be showing boys that there’s more than just the “alpha male” way to exist online.
5. Don’t forget the offline stuff.
At the end of the day, this isn’t just about TikTok. It’s about how we raise boys, how we talk about masculinity, and whether we give them space to ask questions without being laughed at or shut down. Social media is powerful, yes, but it’s not everything. Cultural change starts way before anyone downloads the app.
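To make point 3 above less abstract, here’s a minimal sketch of the kind of reranking change it describes. The signals and weights (predicted_engagement, harm_score, harm_weight) are hypothetical placeholders, not real platform internals; the takeaway is simply that “what gets rewarded” is a tunable line of code, and therefore something regulators could audit.

```python
from dataclasses import dataclass

# A hypothetical reranking step. Field names and scores are invented
# placeholders, not real TikTok signals: the point is only that the
# ranking formula is a design choice, and an auditable one.

@dataclass
class Video:
    title: str
    predicted_engagement: float  # 0..1, what engagement-only ranking optimises
    harm_score: float            # 0..1, e.g. output of a safety classifier

def rank(videos, harm_weight=0.0):
    """Score each video as engagement minus a tunable penalty for harm."""
    return sorted(
        videos,
        key=lambda v: v.predicted_engagement - harm_weight * v.harm_score,
        reverse=True,
    )

feed = [
    Video("alpha-male 'motivation' rant", predicted_engagement=0.9, harm_score=0.8),
    Video("honest talk about confidence", predicted_engagement=0.6, harm_score=0.1),
    Video("cooking tutorial", predicted_engagement=0.5, harm_score=0.0),
]

print([v.title for v in rank(feed)])                   # engagement-only: rant wins
print([v.title for v in rank(feed, harm_weight=0.6)])  # penalised: rant drops to last
```

Same three videos, a different ordering, and nothing changed except one weight in the scoring function.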
But What About Free Speech?
This always comes up, and fair enough: no one wants the internet to be completely policed. But this isn’t about banning opinions. It’s about making sure platforms don’t actively profit off content that’s clearly harmful. If you’re feeding teenagers a steady stream of videos that say women are inferior and men are victims, and you know it, you can’t hide behind “free speech” anymore. That’s not speech; that’s a business model.
Conclusion: The Final Swipe
So here we are, in a world where a 14-year-old boy can open an app and, within minutes, be told that women are the enemy and that he’s better than them by default. Not because he went looking for it, but because the algorithm handed it to him on a silver platter. That’s not just creepy; it’s dangerous.
The scariest part? It’s working. These platforms aren’t just reflecting culture; they’re shaping it. One For You Page at a time, they’re reinforcing harmful narratives and driving a deeper wedge between young men and women. And while it might feel like this is all too big or too far gone, it’s not. The tools to push back are right in front of us: education, regulation, better platform design, and honest conversations about masculinity, identity, and belonging.
Teenage boys don’t need another villain arc. They need adults: parents, teachers, creators, and, most importantly, policymakers who actually care enough to intervene. Because if we don’t challenge the system that’s feeding them this stuff, we’re not just failing them. We’re setting up the next generation to take on a world even more divided than the one we have now.
It’s time to stop pretending the algorithm is neutral. It’s not. And until we do something about it, the divide will only keep growing, swipe after swipe.
References
Baker, C., Ging, D., & Andreasen, M. B. (2024). Recommending toxicity: The role of algorithmic recommender functions on YouTube Shorts and TikTok in promoting male supremacist influencers. DCU Anti-Bullying Centre. https://antibullyingcentre.ie/publication/recommending-toxicity-the-role-of-algorithmic-recommender-functions-on-youtube-shorts-and-tiktok-in-promoting-male-supremacist-influencers/
Burn-Murdoch, J. (2024, January 26). A new global gender divide is emerging. Financial Times. https://www.ft.com/content/29fd9b5c-2f35-41bf-9d4c-994db4e12998
Candelon, F., Charme di Carlo, R., De Bondt, M., & Evgeniou, T. (2021, September). AI regulation is coming. Harvard Business Review. https://hbr.org/2021/09/ai-regulation-is-coming
Crawford, K. (2021). Atlas of AI: Power, politics, and the planetary costs of artificial intelligence. Yale University Press.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin’s Press.
Flew, T. (2021). Regulating platforms. Polity.
Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Schulz, S. (2024, May 1). ‘Make me a sandwich’: Our survey’s disturbing picture of how some boys treat their teachers. The Conversation. https://theconversation.com/make-me-a-sandwich-our-surveys-disturbing-picture-of-how-some-boys-treat-their-teachers-228891