Every day we scroll through endless digital platforms: TikTok, Instagram, X, Snapchat, and so on. So much of this information seems deeply relevant to our daily lives. Yet for all its convenience, it also puts our privacy at risk of being leaked. Imagine the algorithm as a real person who tells you exactly what you want to hear every day: isn't that horrifying? Beyond that, if all your information is in someone else's hands and they merely promise that 'it won't get out', can you trust them? What is true, and where are we headed: toward hell, or paradise?
We like to believe we are in control of ourselves, but is that true? More and more AI agents are now used to write daily news, filter job applications, and recommend videos tailored to our entertainment. Are you seeing what you want to see, or what the AI (or the AI's owner) wants you to see? Big data and AI are shaping culture, opportunity, and belief. Safiya Noble (2018) critically examines this algorithmic oppression, showing how Google perpetuates and amplifies systemic biases, especially against women of color. Noble illustrates that algorithmically ranked search results are unjust: they reinforce societal stereotypes and prejudices. For example, a search for 'Black girls' once returned racist and sexist content. Such algorithmically generated search results shape people's value judgments imperceptibly.
Research on TikTok
Algorithmic oppression, however, does not happen only on Google; it happens on TikTok as well. TikTok is a vast platform with millions of users, and it has redefined how we live. People use it to find friends, browse funny videos, and shop for daily life. It makes life more convenient, but it may also change it.
Mark Andrejevic (2020) claims that when digital monitoring is coupled with algorithmic decision-making and machine learning, it creates new forms of power and challenges democratic forms of accountability and autonomy alike. These powers and challenges produce an 'automation bias' that influences people's lives invisibly.
Algorithms often masquerade as close friends in order to get you to indulge in the content they recommend. TikTok, too, has algorithmic mechanisms designed to keep people watching short videos. The way this recommendation system draws users in resembles the story of Alice in Wonderland, in which Alice is lured down a rabbit hole into another fantasy kingdom; hence this theory is also called the "rabbit hole" effect.
Because of this, the European Commission opened an investigation into TikTok in 2024. In April 2024, it began proceedings over the launch of TikTok Lite in France and Spain, focusing on the potential rabbit-hole effect on minors. In the course of the investigation, TikTok agreed to permanently withdraw the TikTok Lite rewards program; the rest of the findings have not yet been fully made public.
Not only that, but TikTok continues to influence today's teens through its unique algorithms. According to Amnesty International's research, TikTok's recommendation feed can gradually encourage self-harm and suicidal behavior. On the recommended-videos page in particular, the algorithm first identifies teens with underlying mental health issues, then exacerbates their depression, anxiety, and self-harm with videos that romanticize depression, self-harm, and suicide.
Overall, algorithms seem to have been reshaping how people live and the habits they form. This is inseparable from two developments. First, in this era of media convergence, people have grown accustomed to killing time by browsing short videos every day; it is precisely this high volume of viewing that lets algorithms push tailored content to viewers. Second, reading has declined. In the short-video era, people are less inclined to judge right from wrong on their own and more inclined to skim through videos for a quick answer. This undoubtedly gives algorithms an opening to capitalize on.
In today's extremely fragmented information environment, algorithm-driven short-video culture is quietly reshaping the structure of the human brain. Nicholas Carr pointed out in The Shallows: What the Internet Is Doing to Our Brains that the "fast-food" information intake the internet encourages has accustomed us to shallow reading, gradually eroding our patience and capacity for complex thought. With short videos in particular, we tend to decide within a few seconds whether to keep watching. This pattern of "fast switching" reinforces the neural circuitry of instant gratification, making human attention spans shorter and shorter and diminishing our receptivity to "long narratives" and "logical construction".
As mentioned above, some of these fears have already become reality. In September 2021, a mother found that her child had died by suicide in her bedroom. Behind the death were the suicide videos pushed by TikTok, whose algorithmic stream had served depression and self-harm as normalized content to children with immature worldviews. Whether intentional or not, these algorithmic logics have caused many families to fall from heaven to hell overnight. And it happened in France, the very place where the 2024 investigation began. We have to wonder whether the algorithm's pushes in the French region were deliberately configured toward this dangerous type of content.
In addition to suicides, other kinds of incidents have also drawn attention. Luis, a twenty-one-year-old undergraduate diagnosed with bipolar disorder, described his experience with TikTok to Amnesty International: the video recommendation feed is a rabbit hole. Even if you dislike what is pushed to you, it appears again the next time you open the app, and if you watch it this time, that type of video will multiply in the next round of recommendations.
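What Luis describes amounts to an engagement feedback loop: every watch raises the weight of that category in future recommendations, so a single vulnerability compounds. The toy simulation below is a minimal sketch of that loop; the category names, the `boost` factor, and the sampling scheme are illustrative assumptions, not TikTok's actual system.

```python
import random

random.seed(0)  # fixed seed so the toy run is repeatable

def recommend(weights, k=5):
    """Sample k video categories, biased by accumulated engagement weight."""
    cats = list(weights)
    return random.choices(cats, weights=[weights[c] for c in cats], k=k)

def watch(weights, category, boost=2.0):
    """Assumed update rule: each watch multiplies that category's weight."""
    weights[category] *= boost

# Start with uniform interest across three hypothetical categories.
weights = {"comedy": 1.0, "sports": 1.0, "sad": 1.0}

# Simulate a user who watches every "sad" video the feed offers.
for _ in range(20):
    for cat in recommend(weights):
        if cat == "sad":
            watch(weights, cat)

share = weights["sad"] / sum(weights.values())
print(f"'sad' share of recommendation weight: {share:.0%}")
```

Even starting from uniform weights, a handful of watches is enough for one category to dominate the feed, which is exactly the "multiplying" effect Luis describes.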
Addictive, suicide-inducing, and amplifying negativity: TikTok's algorithms seem to do everything we consider bad. But is that really the case? The truth is that even though many cases point to the algorithm as the culprit, these events are outcomes, and can serve only as circumstantial evidence of the algorithm's harm. Are our lives really being controlled by algorithms? We don't know, but what is certain is that this new technology does bring genuinely worrying facts with it.
From an educational point of view, however, whether young people should be exposed to the internet at an early age is also a question worth exploring. It is unfair to blame algorithms for the harm they cause without considering educational factors. Before young people are able to think independently and hold a complete set of values, exposing them to such a vast electronic platform is tantamount to handing their lives over to complete strangers. Parents, it seems, must also shoulder the responsibility of education to keep such tragedies from recurring. Even so, TikTok's algorithm raises the cost of parental supervision and poses an unseen potential danger to users.
In my own childhood I had no access to short videos; I was exposed instead to books and handheld consoles. In the past, the internet's inability to spread information so quickly served, in effect, as protection from algorithms. Algorithms, originally intended as tools to bring mankind a better life, have now been transformed into tools for squeezing out people's remaining time and energy. As I wrote above, algorithms, or AI algorithms, are gradually leading us down a rabbit hole, trying to convert our attention and energy into money, power, and even control.
When we return to the algorithmic problem, perhaps we should look at it from a higher vantage point. Is it the algorithm's fault that these problems happen, or is it someone's choice of algorithm? In other words, someone chose to let these problems happen. When we conjecture in this way, things move into the realm of the unknowable: we are not inside the algorithmic companies and have no glimpse of exactly how the algorithms work. But perhaps what we can do is help those around us stay clear of the risks algorithms pose.
What can we do?
Once we understand what an algorithm is and its general logic, we know its purpose: the algorithm wants users to stay on the platform longer so the platform earns more revenue. It sounds simple, but knowing this lets us minimize the harm algorithms do to us. For example, when we browse a short-video platform like TikTok, we can treat it as a search engine or a tool rather than a necessary part of our daily existence. Algorithmic pushes will still color our search results somewhat, but this beats being overly dependent on these platforms.
Beyond being aware of the risks posed by algorithms, we can also make changes in practice. For example, we can use browser extensions such as uBlock Origin and Privacy Badger to block trackers, conduct a regular "digital detox", and limit the time we spend on short videos each day. At the same time, "algorithmic literacy" should be incorporated into the basic education of young people, so that the next generation has the ability to recognize manipulation and reflect on information. Only by refusing to be consumed by the rapid development of technology can we protect our bottom line as human beings.
Second, reduce the use of networked platforms. Perhaps this makes us sound like uncivilized cavemen (not really). But the truth is that the huge increase in information flow hasn't made people progress as fast as it should. Sometimes a return to the most primitive ways of receiving information is the best way to receive it: go back to paper books, and skip e-book platforms with algorithmic recommendation streams. It may sound overly cautious, but it goes a long way toward curbing the over-consumption of information.
Finally, limiting adolescents' exposure to these platforms, or helping them establish independent value judgments of their own, may be the last line of defense. We inevitably still use electronic platforms every day, so perhaps only strengthening our own judgment can truly keep us out of rabbit holes and away from addictive content.
In summary, perhaps it is difficult for humans to gain the upper hand against algorithms.
The future might look like this…
Algorithms may eventually become an integral part of human civilization, and instead of resisting them we should think about how to live with them. Only by understanding and reflecting on them can we, with human dignity and freedom, shape a future that is not alienated by technology. This is not a confrontation but a choice: a choice about ourselves, about our next generation, and about the direction of human civilization as a whole.

References

Amnesty International. (2023). Driven into darkness: How TikTok's 'For You' feed encourages self-harm and suicidal ideation. https://www.amnesty.org/en/documents/pol40/7350/2023/en/

Andrejevic, M. (2020). Automated culture. In Automated media (1st ed., pp. 44–72). Routledge. https://doi.org/10.4324/9780429242595-3

Carr, N. (2010). The shallows: What the Internet is doing to our brains. W. W. Norton.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press. https://doi.org/10.18574/9781479833641

Pasquale, F. (2015). The need to know. In The black box society: The secret algorithms that control money and information (pp. 1–18). Harvard University Press.