Regulating hate speech against women: challenges and perspectives in protecting women streamers on Twitch


Introduction

“Shut up bitch”, “Feminazi”, “You deserve rape dressed like that”, “How many did you blow to get there” – these are examples of the online violence that 38% of women have experienced and 85% have witnessed (The Economist Intelligence Unit, 2021). While the Internet and the emergence of numerous online platforms have opened up new possibilities for expression and identity-building, they have also amplified the spread of hate speech, particularly against women (European Parliament, Directorate-General for Internal Policies of the Union, Wilk, A., 2018). Such speech is an integral part of gender-based cyber-violence, alongside practices such as doxxing, revenge porn and stalking. The phenomenon has significant societal repercussions and is constantly evolving with the advent of new technologies (Hicks, 2021). It raises important questions about the regulation needed to protect women online, especially on platforms where sexism is particularly prevalent, such as Twitch, recently rocked by several scandals over the harassment suffered and denounced by numerous female streamers.

Understanding hate speech targeting women: an overview

Sexist hate speech is defined by the Council of Europe in its report “Combating sexist hate speech” as “one of the expressions of sexism, which can be defined as any assumption, opinion, assertion, gesture or behavior intended to express contempt for a person on the grounds of their sex or gender, or to consider them as inferior or essentially reduced to their sexual dimension. Sexist hate speech includes expressions that propagate, incite, promote or justify hatred based on sex” (Stratégie du Conseil de l’Europe pour l’égalité entre les femmes et les hommes, 2016). Despite its recent proliferation in virtual spaces, it is not confined to the Internet or social networks. It has its roots in a society still imbued with patriarchal ideology, where sexism is present in many aspects of everyday life (Ballet, 2023). Practices such as stalking and revenge porn, which form an integral part of the cyber-violence suffered by women, are thus seen as extensions of domestic violence enabled by technology (European Parliament, Directorate-General for Internal Policies of the Union, Wilk, A., 2018). These online behaviors have tangible repercussions in the real world, sometimes even influencing acts of violence. Indeed, the proliferation of masculinist ideologies, as embodied by incels and public personalities such as Andrew Tate, has repercussions both on the occurrence of femicides and on women’s mental health. For example, on May 23, 2014, Elliot Rodger announced in a YouTube video, “I will slaughter every single spoiled, stuck-up blonde slut I see” (The New York Times, 2019). Within 24 hours, he had killed six people before taking his own life (Pottier, 2014). More recently, Andrew Tate, a self-proclaimed misogynist with over 9 million followers on Twitter, made comments such as “I think the women belong to the man”, suggested that rape victims bear some responsibility for their assault and made countless slut-shaming statements, all while sporting the biography “Human Trafficker because I told friends how to post on TikTok. Rapist because some girl remembered from 15 years ago once I became rich”, referring to the accusations of sexual assault and human trafficking he and his brother have been facing since 2022 (X, n.d.; LIBERATION & AFP, 2023).

The psychological consequences of this outspoken and increasingly violent misogyny are severe for its victims, including low self-esteem and high levels of stress and anxiety. According to Amnesty International, more than half of women who have suffered online abuse or harassment have experienced a drop in self-esteem or self-confidence, stress, anxiety or panic attacks (Amnesty International, 2017). Public figures are particularly vulnerable to such violence, especially when they address feminist issues, which often leads them to withdraw or self-censor to preserve their mental health and safety. This reaction unfortunately reinforces male domination in contexts already marked by a high degree of sexism (Ballet, 2023). Such online discourse is exacerbated by a variety of factors, including anonymity (Lapidot-Lefler & Barak, 2012), the challenges of moderation (Gillespie, 2016) and the way algorithms work (Tomlinson, 2023). In addition, new technologies offer perpetrators of cyber-violence the opportunity to hijack these innovations for malicious purposes, further compounding the severity and scope of their acts. For example, the deepfake, defined by the Oxford English Dictionary as “any of various media, esp. a video, that has been digitally manipulated to replace one person’s likeness convincingly with that of another, often used maliciously to show someone doing something that he or she did not do”, has recently been used to defame various personalities, mainly through photo and video montages of a pornographic nature (Oxford English Dictionary, 2023). The rise of artificial intelligence therefore also raises concerns about its role in the spread of online hate speech targeting women.

The Twitch streamers’ dilemma: gendered hate speech in gaming

While hatred of women is visible across the Internet, it is all the more prevalent in certain contexts, such as the gaming industry, a professional milieu characterized by an under-representation of women. Although video game playing is evolving towards relative parity (47% of French female gamers are regular players, compared with 65% of men, according to IFOP), “It’s [above all] important to see how they are accepted within the community”, as Enora Lanoë-Danel, IFOP research manager, points out (Chamoiseau, 2023; Subileau, 2023). Indeed, 53% of female players have experienced or witnessed differential treatment on the grounds of gender (refusal to play or talk with a woman, paternalistic and condescending attitudes, sexist criticism of their level of play) and/or verbal aggression and threats (remarks about physical appearance, obscene or sexually connoted comments, sexist remarks, insults and slurs, threats of sexual assault) (Chamoiseau, 2023). Added to this are the remarks female players may face when talking about their passion to those around them (Subileau, 2023). These behaviors have psychological repercussions on players (feelings of illegitimacy, anxiety, etc.) and also force them to adopt avoidance strategies: this is the case for 40% of female gamers, a figure that rises to 66% among female fighting-game players (Chamoiseau, 2023). Such strategies include refusing to take part in voice chat, leaving a group or game, avoiding online games altogether or even concealing one’s gender. This is what cyber-violence against women looks like in the video game world. It should be noted, however, that the phenomenon rests on a kind of paradox: the gamer and geek communities are themselves marginalized, so being singled out for marginalizing others can provoke a certain form of cognitive dissonance (Massanari, 2016). It is important to denounce the problem of sexist hate speech in gaming and to try to understand its ins and outs, without reducing the issue to crude generalizations about geeks. At the same time, gamers must become aware of the problem, take responsibility for it and work towards greater inclusion of women in the video game world and, more generally, towards a healthier environment.

These same issues can be found on Twitch, a live video streaming platform launched in 2011. Originally, it focused primarily on video games, allowing gamers to broadcast their gaming sessions live and viewers to watch and interact with streamers in real time. Over the years, however, Twitch has expanded to a wide variety of live content, including music, art creation, cooking, discussion streams and much more. Twitch is one of the world’s most popular streaming platforms: in 2023, it had over 7 million active streamers per month and an average of around 2.45 million concurrent viewers (Gautier, 2023; Twitch Tracker, 2024). It has also become an important venue for live events, such as video game tournaments, virtual concerts and live broadcasts of sporting events.

On October 24, 2022, French streamer Maghla, active on Twitch since 2017 and currently followed by over 900,000 people on the platform, shared a thread on Twitter denouncing the cyber-violence she suffers on a daily basis, in doing so encouraging other French-speaking streamers to speak out about a problem hitherto underestimated by many.


Her thread is divided into five main sections: Reddit, forums, Discord, private messages and live streams. She begins by pointing out that, although she mostly dresses in oversized clothes, the rare moments when her body is visible at all end up on Reddit pages dedicated to sexual roleplay about her, accompanied by pornographic montages. The forums, meanwhile, are filled with entirely uninhibited posts from people describing masturbating to her streams and photos, along with pornographic deepfakes, all accompanied by sexual and sexist comments, including numerous references to rape. The same content is shared on Discord. She also denounces the unsolicited sexual photos and similar messages sent to her. She then turns to her live streams and the large number of bans she has to issue for sexual and sexist remarks whenever her body is even slightly visible. Finally, she describes the impact of these behaviors on her mental health, her exhaustion and her weariness with the situation. The thread prompted strong reactions from other streamers, who in turn denounced their harassers, and a wave of support from the platform’s French-speaking community (Maghla, 2022). Streamer Shironamie published a video that particularly shocked Internet users, as it bears witness to the violence of the harassment suffered by female streamers on Twitch.

The tweet accompanying the video describes the context: “MESSAGE OF PREVENTION. Today I was live, as I am every day, until I got a call from a stranger. He pretended to be a delivery man and managed to get my address. Unfortunately for him, he didn’t hide his number. This was followed by a threatening call.” For over two minutes, the individual, perfectly aware that he was being broadcast live and that his words were being recorded, threatened her in an extremely violent and explicit manner: “Do you know what we do in Israel to snitches? We behead them and rape them. I’m mentally ill. It excites me when you’re worried like that, seeing you suffer gives me a hard-on. […] I’ve been planning my attack on you for several weeks now and in a few weeks I’ll have my plan for raping you. I’ll know the rhythm at which you leave your house, when you take your shower, I’ll go back to your place […] I’ll masturbate in front of you if I have to, but you won’t catch me. […] If you call the cops, I’ll rape you, but whatever you do, I’ll rape you” (Shironamie, 2022). This excerpt and the evidence presented in Maghla’s thread attest to the seriousness of the stalkers’ acts, which they do not hesitate to commit in full view of the public, without any fear of punishment. Beyond the question of educating these individuals about sexist behavior that crosses into outright illegality, this testifies to a clear failure on the part of the platform, but also of the justice system, to impose appropriate sanctions on harassers and to support female streamers. Maghla ironically notes in her thread the advice that ignoring harassers would be enough to make them stop on their own, something reality contradicts, given the forums that continue to be fed every day. There is a real problem of governance here: it is not acceptable for offenses of such magnitude and visibility to take place with impunity, or for streamers to be told simply to ignore violence of this severity. Moreover, the fact that streamers expressing their support admitted they had not grasped the extent of the harassment suffered by their female colleagues attests to a real lack of awareness of the subject. It is all the more worrying that such behavior is not confined to the virtual world, but has repercussions on streamers’ lives: BagheraJones, a Swiss streamer, had to move house “because of harassment and ‘visits’ to her apartment” and declared: “I developed enormous anxieties and my insomnia became dreadful” (Terriennes & Charrier, 2022). This clearly shows that online sexist hate speech has a direct impact on the lives of the platform’s content creators, whose mental health is jeopardized and who are forced, at the very least, to conceal their bodies and remain constantly vigilant about the personal information they divulge. Such precautions are often not enough in the face of individuals determined to harm them through stalking, insults, threats, deepfakes and cyber-violence in its worst forms.

Analyzing current policies: addressing hate speech against women on Twitch

In December 2020, Twitch published the article “New rules on hateful behavior and harassment” presenting, as the name suggests, new measures to combat harassment, hateful behavior and sexual harassment. Starting from the observation that “many people on Twitch, particularly women, members of the LGBTQIA+ community, Black people, Indigenous people and people of color, are more often victims of online harassment, including on our platform”, the platform stresses that “these changes are intended to better protect the community, not to be punitive”, underlining the central debate in moderation between governance and respect for freedom of expression (Twitch, 2020; Gillespie, 2016). Two years later, in a follow-up article, “An update on our measures to eradicate targeted acts of hate”, Twitch highlighted the progress made after the “hate raids”, waves of attacks by malicious bots, that took place in 2021 (Twitch, 2022). The platform had committed to strengthening its policy on hateful behavior and harassment with more specific prohibitions (comments on physical appearance, whether positive or not, computer attacks, etc.), to improving its moderation and security tools (notably through new technologies such as AI), to providing educational and support resources for streamers, and to taking legal action against harassers. The tone and purpose of these two articles testify to a desire for transparency and accountability, reinforced by the sense of proximity and attentiveness to users conveyed by the mention of the Twitch UserVoice feedback forum, where users can submit suggestions on various aspects of the platform’s management (Twitch, n.d.).
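To give a concrete, if deliberately simplified, idea of what automated moderation tooling involves, the Python sketch below shows a minimal keyword-based chat filter. It is purely illustrative: the blocked-term list, normalization rules and function names are invented for this example and bear no relation to Twitch’s actual AutoMod system, which is considerably more sophisticated.

import re
import unicodedata

# Purely illustrative blocklist and rules: not Twitch's actual AutoMod.
BLOCKED_TERMS = {"feminazi", "slut"}  # placeholder terms for the example

def normalize(message: str) -> str:
    """Lowercase the message, strip accents and undo simple character substitutions."""
    text = unicodedata.normalize("NFKD", message).encode("ascii", "ignore").decode()
    text = text.lower()
    # Undo common obfuscations such as "s1ut" or "f3minazi".
    text = text.translate(str.maketrans("013457@$", "oleastas"))
    return re.sub(r"[^a-z\s]", "", text)

def should_hold_for_review(message: str) -> bool:
    """Return True if the normalized message contains a blocked term."""
    return any(word in BLOCKED_TERMS for word in normalize(message).split())

if __name__ == "__main__":
    for msg in ["great stream!", "shut up s1ut"]:
        verdict = "held for review" if should_hold_for_review(msg) else "allowed"
        print(f"{msg!r} -> {verdict}")

Even this toy example hints at why moderation is so difficult: harassers routinely obfuscate slurs, switch languages or rely on context that keyword matching cannot capture, which is precisely the kind of limitation that pushes platforms towards machine-learning approaches and human review (Gillespie, 2016).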

Beyond the platform’s own initiatives, in 2022 Twitch signed up to the EU code of conduct on countering illegal hate speech online, alongside Facebook, Twitter, YouTube, TikTok and seven other partners (Représentation en France, 2022). Launched in 2016, this code of conduct requires signatories “to facilitate the reporting of illegal hate speech by their users and to cooperate with civil society organizations and national authorities to this end” (Lequeux, 2022).

In the wake of these scandals and a growing sense of insecurity, streamers themselves have taken steps to address sexist hate speech on Twitch. Tools and channels have been set up to share information about problematic viewers with a view to blacklisting them. Charity events have also been organized, such as the Furax Charity Marathon, “a weekend of streaming to help victims of sexist and sexual violence, and online harassment”. The 2024 edition took place on April 5, 6 and 7 and raised nearly €150,000 for the association Féministes Contre le Cyberharcèlement (Furax, 2024).

These measures are far from sufficient, but Twitch is aware of this and intends to “persevere in ways that have proven successful […] with an emphasis on ease of use and on being less restrictive to your overall Twitch experience”. Given that the problem is rooted in the sexism prevalent throughout society, perhaps it would also be worthwhile for governmental institutions to put in place mechanisms to raise awareness of sexist and sexual violence and to implement effective legal measures, especially against displays of violence as explicit as those documented in this article.

Conclusion

Online sexist hate speech takes many forms, and is becoming increasingly violent as masculinist rhetoric spreads and new technologies emerge. However, despite the scale of hate speech, victims’ voices are also being heard in these virtual spaces. It is crucial to find effective solutions to guarantee a healthy environment, particularly in professional fields such as gaming, which is largely dominated by men. The question of governance responsibility remains complex and worrying. For the time being, it seems difficult to envisage significant progress, especially when platforms like Twitch seem more inclined to sexualize and censor women than to protect them from the sexist hate speech they are often the target of.


References
