From Social Events to Memes: Platform Algorithms Are Paralyzing Our Perception

Imagine this: one idle afternoon, you are scrolling through social media when a sensational breaking news story catches your eye – a major event is unfolding. At first, you are as shocked as everyone else who sees it. The comment section fills with serious discussion and concerned voices, and you can’t help sharing and commenting yourself. For the first day or two, you keep refreshing for the latest updates. But after a week or two, you think of it only occasionally, and when you search for it, you find the homepage dominated by a celebrity gossip topic.

A month or two later, you notice a new and interesting word in the bullet comments and discussions, used to describe a certain kind of scene. Suddenly, you realize it is the keyword from that news story. But you can no longer recall the sadness, anger, and shock it once brought you; now you feel only amusement and curiosity. You start casually dropping the word into conversations and sharing the joke with friends. The more you use it, the more that once heart-wrenching event becomes just a trendy symbol, a little badge that shows you are up to date. The emotional experience of those first days has long been forgotten in the endless scrolling and sharing.

One day, you suddenly want to know how the event turned out, so you search for its keyword on the platform. What pops up is mostly jokes, memes, and short videos adapted from it. The real news reports are buried under a pile of dramatic twists and funny reinterpretations. Eventually, you find the outcome in a long article from a little-known account. The matter was resolved, but very few comments show any concern for it. You feel sad for a moment, but you are immediately pulled away by the next round of trending searches and entertainment topics, leaving that page and never searching for it again.

This is how platform algorithms work. They dull our sensitivity to pain with fast-food content, reducing heavy events to shallow entertainment.

How Do Algorithms Make Events “Lose Weight”?

The platform does not directly tell you what to watch, but it quietly decides what content is worth seeing. That decision is based not on the importance or social significance of the content itself but on data: likes, retweets, comments, and viewing time together form the algorithm’s criteria for judging content. On social media, if a message wants to go viral, it must first conform to the preferences of the platform’s algorithms: short, emotional, easy to understand, easy to repost, and, ideally, good for punchlines or controversy. As a result, social events that are painful and complex, and that demand slow reading and slow thinking, are constantly processed into consumable content adapted to the platform. They are cut into 10-second clips, set to trending background music, overlaid with eye-catching subtitles, condensed into a single relatable line or a screenshot memorable enough to stick. Only then can such news claim a place in the flood of information and earn a little exposure.
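To make this concrete, here is a minimal, purely illustrative sketch of an engagement-based ranking score. Every name, weight, and field in it is an invented assumption, not any real platform’s code; the point is only that a function built solely from behavioral signals will systematically favor the short, emotional, finishable clip over the long, heavy report.

```python
from dataclasses import dataclass

@dataclass
class Post:
    """A piece of content as a ranking system might see it: behavioral data only."""
    likes: int
    shares: int
    comments: int
    avg_watch_seconds: float
    duration_seconds: float

def engagement_score(post: Post) -> float:
    """Hypothetical ranking score built only from engagement signals.

    Note what is absent: nothing here measures importance, accuracy,
    or social significance – only how users reacted.
    """
    completion_rate = min(post.avg_watch_seconds / post.duration_seconds, 1.0)
    return (
        1.0 * post.likes
        + 3.0 * post.shares       # reposts spread content, so weight them heavily
        + 2.0 * post.comments     # controversy drives comment counts
        + 50.0 * completion_rate  # short clips are far easier to finish
    )

# A 10-second meme clip vs. a long investigative report on the same event:
meme = Post(likes=900, shares=400, comments=250,
            avg_watch_seconds=9.5, duration_seconds=10)
report = Post(likes=300, shares=40, comments=60,
              avg_watch_seconds=180, duration_seconds=900)
print(engagement_score(meme) > engagement_score(report))  # True: the meme wins the feed
```

Nothing in such a function is hostile to serious news; it simply has no column for seriousness.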

As Just and Latzer (2016) argue, algorithmic selection is not a neutral process but a construction of reality: platforms continuously reinforce certain types of content based on behavioral data such as click-through rates and interactions, pushing public events “as users want to see them” rather than as they are. In other words, the more users prefer entertaining expression, the more of this “lightweight” information the platform pushes, eventually forming a self-reinforcing cycle.

The more users favor short, fast expression, the more of it the platform pushes to them. The more often users like social issues packaged as ridicule, the more the system concludes that “this is the kind of social concern users want.” Over time, the weight of an event is polished away, its emotional edges are smoothed out, the pain grows lighter, and the anger turns comedic.

When Pain Also Becomes Likable

If the visual symbolization of events makes us remember the picture but forget the meaning, then the fast-fooding of emotion makes us feel pain without ever really understanding it. Platform algorithms not only filter what we watch; they also guide how we feel, how long we feel it, and how much feeling counts as enough.

Content with high emotional intensity is naturally favored by algorithms. A short video that makes people angry, a rant about an injustice, or a photo that takes their breath away is more likely to draw likes, comments, and retweets. This mechanism amounts to an emotion recommendation system, constantly pushing stimulating emotional clips at us until we are addicted to that pleasurable emotional consumption.

In “Automated Culture,” Andrejevic (2019) points out that platforms are not just recommending content but reconstructing how our emotions are expressed and how they circulate. This emotional cycle does not encourage depth; it seeks immediate pleasure and immediate reaction.

We may be moved to tears by one video, only to be amused moments later by a clip of a dancing cat. We participate in being moved and, in being moved, forget. Pain is no longer a starting point for action but a feeling to be consumed.

From Participation to a “Sense of Participation”: Activism as Entertainment

Whenever a public issue trends, hashtags like “#SupportXX” quickly pop up, and people join the ranks of digital action. But the question is: how deep is that involvement? Are we really driving change, or are we just being guided by algorithms to perform a show of participation?

Flew (2021) pointed out that commercial platforms care not about whether content is meaningful or deep, but about whether it can attract attention and drive user behavior. User behavior data, rather than the ethical weight of the content itself, becomes the standard by which platforms evaluate value. This makes “support” less a stance than a performance of social engagement.

Case Study #1: The White Paper Protest – When Political Resistance Becomes Recreational Material

The White Paper protest took place on the streets of Chinese cities in late 2022, born of discontent with lockdown policies and triggered by the Urumqi apartment fire. Unable to voice their demands openly, protesters held up blank sheets of A4 paper as a symbol of silent opposition. The blank page quickly became a symbol of resistance under political repression, conveying the message “I cannot speak, but I must express myself.” The protest was a symbolic revolt by young people against censorship and repression, and its organizational and symbolic power attracted international media attention (Moss, 2024).

However, under the workings of platform visual algorithms and social mechanics, this act of resistance soon underwent context drift. On platforms such as TikTok, RedNote, and Weibo, users began posing for photos and doing filter challenges with the gesture. Some fans even Photoshopped blank sheets into images to “protest” on behalf of their idols, with captions like “How come my idol is not trending?” and “Refuse your cold treatment of them,” borrowing the defiant posture to express grievances about entertainment topics.

To fit the platform’s communication mechanics, a serious social symbol was transformed into a performative gesture that is easy to understand, easy to imitate, and easy to riff on. It lost its original defiance and was repackaged as a pop visual symbol.

Case Study #2: PUA Language – From Accusatory Tool to Memetic Material

PUA, short for “pickup artist,” originally referred to a set of manipulative seduction techniques; in the Chinese internet context, it gradually came to mean emotional manipulation and psychological abuse. It was once an important keyword for exposing hidden harm and alerting the public, used to describe psychological oppression, demeaning, and controlling behavior in intimate relationships (Freeman, 2014), and it carried the social demands of gender justice and emotional autonomy.

However, following a slip of the tongue in a celebrity interview, the word began its metamorphosis into entertainment. When the actress Xu Di mistakenly said “CPU” instead of “PUA” on a program, viewers instantly started a meme relay: formulaic sentences such as “he is CPU-ing you,” “she is SUV-ing you,” and “they are KFC-ing you” spread rapidly across social platforms, spawning a whole family of imitation three-letter acronyms. A word originally used to name injury was turned, through collective meme-play and algorithmic recommendation, into an acronym game. Under the platform’s entertainment mechanics, vocabulary that should trigger alarm was drained of its seriousness amid the laughter.

Guided by platform algorithms, more and more public expressions once full of tension and pain are compressed into light, shareable symbols: the white paper protest is prettified into a minimalist visual motif, just as PUA complaints morph into “CPU” jokes. With the blessing of algorithmic preferences, only content that looks good, reads easily, and copies easily spreads. We thought we were participating; in fact, we were being drawn into a data-driven cycle of content production.

How Do We Become Co-conspirators?

Between likes, comments, and retweets, we seem to be passive recipients of information. But the platform algorithm is not a one-way feed of content; it is a feedback system. Every action a user takes feeds the algorithm in turn, reinforcing the probability that a certain type of content will be recommended. We are not only the audience for information; we also, unconsciously, shape the logic of its transmission.

Pasquale (2015) pointed out that the power to order information is the power to decide what is remembered and what is forgotten. Through its control of ranking and exposure, the platform builds a kind of platform memory, and it is precisely through our participation that we repeatedly reinforce this memory system.

We don’t always retweet, copy, or mock out of malice. More often, it is because that engagement gives us a sense of social presence. The platform reads these behaviors as signals, and the algorithm pushes similar content again. As Pasquale (2015) points out, the rules encoded in the black box carry a specific ranking of values. What we think of as self-expression is in fact the fulfillment of a behavioral path the platform has laid out.
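To see why this loop tightens, consider a toy simulation – a sketch under invented assumptions, where the content labels, the like rates, and the proportional-exposure rule are all hypothetical rather than taken from any real recommender. If exposure is proportional to accumulated engagement, and users like the meme version only somewhat more often, the feed drifts steadily toward memes:

```python
import random

# Minimal feedback-loop sketch (illustrative assumptions, not real platform code):
# the recommender shows content in proportion to accumulated engagement, and every
# like feeds back into that proportion, so a small user preference compounds.

engagement = {"serious report": 100.0, "meme version": 100.0}  # start balanced
LIKE_RATE = {"serious report": 0.05, "meme version": 0.10}     # memes get liked more

random.seed(0)
for _ in range(10_000):
    # The feedback step: exposure is proportional to past engagement.
    shown = random.choices(list(engagement), weights=list(engagement.values()))[0]
    if random.random() < LIKE_RATE[shown]:
        engagement[shown] += 1.0  # this like raises the odds of being shown again

meme_share = engagement["meme version"] / sum(engagement.values())
print(f"Share of ranking weight held by the meme version: {meme_share:.0%}")
```

Nothing in the loop is malicious; proportional feedback alone is enough to gradually crowd the serious version out of view.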

What Can We Do?

What can we still do in the face of an algorithm-driven, information-saturated platform environment? Within this mechanism, how do we hold on to the pain of events and an understanding of reality?

Perhaps the answer is not an extreme measure like abandoning platforms entirely or refusing all short-form content; that is simply unrealistic. What we can do is try to reclaim sovereignty over our attention.

We can start with small actions. For example, when you scroll past an event, don’t rush to retweet or comment; take a moment to understand its causes and consequences. When you see a video of neatly packaged pain, ask yourself: is this a message, or a designed emotional template? Does it move you because it is true, or because it happens to be set to familiar music?

In 2020, after the death of George Floyd, protests mounted against images of victims being glamorized, formatted, and even turned into profile pictures and background templates. In response, a group of users launched the #NotYourMeme campaign against the visual adornment of tragedy. They pointed out that images of Floyd, Breonna Taylor, and others were being consumed as traffic tools and even as a backdrop for performative activism. As Kay Hollins wrote on her blog: “Breonna Taylor is not an Instagram caption… not a trendy topic for performative allyship” (Hollins, 2020). The campaign reminds us that expression is not a substitute for participation, and popularity does not equal caring.

We cannot immediately change the platform’s underlying logic, but we can decide whether to keep wallowing in its paralyzing mechanism.

The next time you scroll past a social event that is being turned into a joke, or see a symbol that feels a little too familiar, maybe you can do just one small thing –

Pause for a second and ask yourself: Do I remember what it was like?

Reference List

Andrejevic, M. (2019). Automated culture. In Automated Media (pp. 44–72). Routledge.

Flew, T. (2021). Regulating platforms. In Digital Media and Society (pp. 81–87). Polity Press.

Freeman, H. (2014). Women, beware this PUA army of sleazebags, saddos and weirdos. The Guardian. https://www.theguardian.com/commentisfree/2014/nov/12/pua-pick-up-artists-julien-blanc-dapper-laughs

Hollins, K. (2020). #NotYourMeme: How the memeification of Black women’s plight does more harm than good. Please, No BS. https://www.pleasenobs.com/blog/notyourmeme-how-the-memeification-of-black-womens-plight-does-more-harm-than-good

Just, N., & Latzer, M. (2016). Governance by algorithms: Reality construction by algorithmic selection on the internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157

Moss, A. (2024). China students protest censorship with blank paper demonstrations (2022). Global Nonviolent Action Database. https://nvdatabase.swarthmore.edu/node/5111

Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
