Algorithms can make life easier, but are they helping or hurting us?

Introduction

Our data is analyzed by algorithms

Have you ever wondered whether our privacy is being violated by algorithms, even as we enjoy the conveniences they provide? You may not be familiar with what algorithms are, so let me give you an example. Suppose you went out with your friends yesterday and discussed the items you wanted to buy on the way. The next day, when you opened your phone, you found that a shopping app was pushing advertisements for those very items. You were confused, because you never searched for them on your phone. This is the power of algorithms, which can quietly learn your interests and intrude on your privacy. In this post, I want to explore the relationship between our privacy and algorithms, and ask: how should we deal with this situation?

Where do algorithms learn about us?

Algorithms acquire data in various ways

Some people may wonder how algorithms work. Before that, we should first understand that algorithms know us so well because they have learned from our data. Every search we make in the browser, every item we browse, and even our chat histories form the basis of an algorithm's operation. This data is integrated into a complete profile that records our preferences, daily routines, and even our physical condition.

Is our information protected once it has been accessed? How should the data be used properly? We can explore this question with the help of the literature. Nissenbaum (2018) introduces a concept called contextual integrity: privacy is not only about the information itself, but also about the "appropriateness" of how information flows. Privacy violations arise when a flow of information breaks the norms of a specific context (Nissenbaum, 2018, p. 833). Let me explain with an example. Say you recently wanted to buy a house, and you shared your ideas about the house, your budget, and your financial situation with a real estate company. That flow of information was appropriate in its context. But then you found yourself receiving calls from other real estate companies, which meant your information had been sold to third parties. That flow is inappropriate. Even though you never told the first company that this was not allowed, within the original context such a restriction should go without saying.

Today’s platforms are also breaking this "appropriateness." We searched for information about an item on one platform, yet we receive customized advertisements for it on another. Our preferences should be known only to the first platform, but the algorithm lets our information flow across platforms. As a result, we sometimes feel offended, because the rules of the context have been broken. This is how algorithms operate, and they influence our lives in invisible ways. Platforms are able to use this data because the terms are hidden behind the "Do you agree that the website can access your information?" prompt, which we never read and usually just accept. In general, these terms give websites a great deal of power, especially the large corporate platforms (Suzor, 2019, p. 11).

How do algorithms work?

Companies are using our data as a resource

We have already discussed where algorithms acquire the data they use to understand us. So how do algorithms actually work? After collecting our data, they begin to process it, sorting it into classifications and predictions.

For example, our browsing history, our interests, and even our conversations with friends are fundamental to their operation. You may have searched online for suitable gyms near your office and read about how to relieve stress. You might then be classified as an "office worker who wants to relieve stress through fitness". Our unthinking searches are recorded by the algorithm and used to infer our preferences and habits. In a sense, the algorithm really can send us accurate recommendations, because all of our data is being tracked.
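The kind of classification described above can be sketched as a toy keyword-matching profiler. Everything here (the segment names, keyword rules, and example searches) is invented for illustration; real platform profiling is far more complex and opaque, but the basic idea of mapping behaviour to interest segments is the same.

```python
# Toy sketch: tagging a user with interest segments from their search history.
# Segment names and keywords are hypothetical, for illustration only.

SEGMENT_RULES = {
    "fitness": ["gym", "workout", "protein"],
    "stress relief": ["relieve stress", "meditation", "burnout"],
}

def profile_user(search_history):
    """Return the interest segments whose keywords appear in any search query."""
    tags = []
    for segment, keywords in SEGMENT_RULES.items():
        if any(kw in query.lower() for query in search_history for kw in keywords):
            tags.append(segment)
    return tags

history = ["gyms near my office", "how to relieve stress at work"]
print(profile_user(history))  # ['fitness', 'stress relief']
```

Two innocuous searches are enough to place this user in the "office worker who wants to relieve stress through fitness" bucket, which is exactly why data we barely think about is so valuable to platforms.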

In fact, we can learn more about this from Flew (2021). Zuboff (2019) argues that digital platform companies do not simply provide convenient services; they use user behaviour data to generate profit, creating a new "economic order". This is the definition of surveillance capitalism, which is not about improving the service experience on the platform but about predicting and controlling user behaviour (Zuboff, 2019, as cited in Flew, 2021, p. 78). A related concept is "behavioural surplus": every step we take on the internet is treated by the platform as profitable data. In short, we think we are using the platform's convenient services for free, but in truth we are paying with our privacy. Our data is analysed, predictions are made from it, and those predictions become a tool for profit.

How do algorithms work? The previous sections explain it quite clearly: the algorithm decides what content to deliver to us by analysing our data and predicting our behaviour. That means much of what we see on the internet is actually determined by the algorithm. The news you read and the products you see in a shopping app are all curated by the algorithm after analysing your records. We now live in an algorithm-based media world.

How is our life controlled by algorithms?

Are we controlling algorithms, or are we controlled by them?

I explained above how algorithms work, and I ended with a claim: we now live in an algorithm-based media world. The reason is that algorithms are constantly invading our lives, even influencing our decisions and the world we see.

For example, when we watch short videos, the algorithm keeps feeding us videos that interest us. From one point of view, this fits our interests well, but it also creates a filter bubble. We receive only one-sided information, and our view of the world narrows considerably. Something related is happening in China right now. News about women who are victims of domestic violence tends to be delivered only to women, not to men. This happens because women usually repost such news, while most men rarely engage with related matters. The spread of the news narrows, and if not enough people pay attention, the victims do not receive the help they need. I think this is a telling example of algorithms controlling the information we see.
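The filter-bubble dynamic described above is essentially a feedback loop: the system recommends what you clicked, you click what it recommends, and the feed narrows. The sketch below is a deliberately minimal model with invented items and a crude "rank by previously clicked topic" rule, not any real platform's recommender, but it shows how a single early click can come to dominate a feed.

```python
# Minimal sketch of a filter-bubble feedback loop.
# Catalog, topics, and the ranking rule are hypothetical, for illustration only.

from collections import Counter

def recommend(click_history, catalog, k=3):
    """Rank catalog items by how often their topic was clicked before."""
    topic_counts = Counter(item["topic"] for item in click_history)
    return sorted(catalog, key=lambda item: -topic_counts[item["topic"]])[:k]

catalog = [
    {"title": "Cat video", "topic": "pets"},
    {"title": "Election analysis", "topic": "politics"},
    {"title": "Dog tricks", "topic": "pets"},
    {"title": "Science news", "topic": "science"},
]

# One initial click on a pets video, then three rounds of the loop:
# the user always clicks the top recommendation.
clicks = [{"title": "Cat video", "topic": "pets"}]
for _ in range(3):
    feed = recommend(clicks, catalog)
    clicks.append(feed[0])

# After a few rounds, the top of the feed is dominated by one topic.
print([item["topic"] for item in feed])
```

The "science" item never surfaces at all: nothing about it changed, the user simply never got the chance to click it. That self-reinforcing narrowing is the filter bubble in miniature.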

This situation is already worrying, and what is more serious is that the platforms providing these algorithmic services are usually large companies, such as Google. These platforms have gradually become part of our lives and have even formed a huge ecosystem (Flew, 2021, p. 72). This means the platforms not only impose unfair terms that we never read, but also hold a degree of monopoly and shape the information we see through their algorithms (Srnicek, 2017, as cited in Flew, 2021, p. 73). From Suzor (2019) we also learn that companies have near-absolute power over how they operate the social media they own. Once we accept their terms, we must obey their rules. When we use these platforms, we feel free to see and use information as we wish, but in fact we are being controlled. The platforms' monopoly over the information we see further strengthens that control.

How can we defend our rights?

Data protection is indeed necessary

After reading the previous sections, you may well be angry that your privacy is being violated by algorithms. What should we do to defend our rights? Should we simply let the situation run its course? Of course not. We should defend our digital rights and become the masters of our own privacy.

According to Karppinen (2017), calls to protect digital rights have generated many reports. Although human rights were once marginal to internet debates, they are now the main framework for discussing digital rights and the internet. Understanding your rights is the first step to defending them. Digital rights divide into negative and positive rights, and both should be defended (Karppinen, 2017, p. 97). In the current situation, we need to take active steps to protect our rights from violation, including from the power of large companies (Karppinen, 2017, p. 98).

We can start by defending ourselves: install privacy-respecting browsers, and stay alert to the platforms' algorithms and their biases. When browsing, actively search out different opinions to break the filter bubble, and keep a neutral mind rather than being swayed by the extreme views the algorithm delivers. When we are treated unfairly by an algorithm, we must question it. Flew (2021) notes that most complaints about unfair treatment by platforms are usually not approved, though celebrities are sometimes exceptions (p. 23). This is deeply unfair, so we should complain all the more when it happens, so that our rights are taken seriously. We should also regularly manage and delete our own data, such as search and chat histories, and disable unnecessary permissions to better protect our privacy.

Finally, we should demand strong government regulation rather than leaving all the power with the companies that own the platforms. As I mentioned earlier, Suzor (2019) argues that platforms hold almost all the power, and we should push back against that situation.

We should be masters of our algorithms!

Algorithms do bring us a lot of convenience. They can recommend music, restaurants, and even jobs that match our personalities. However, they are also silently invading our privacy and even steering our lives. This invasion is not inevitable; the fact that it affects our daily lives simply shows the downside of technological progress. That does not mean we should entirely reject the convenience algorithms bring. Rather, we should stay aware of how the information in our daily lives is obtained. Next time you enjoy the convenience of an algorithm, pay attention to protecting your privacy too. In the end, we are still the masters of our own privacy.

References

Flew, T. (2021). Issues of concern. In Regulating platforms (pp. 72–79). Polity.

Karppinen, K. (2017). Human rights and the digital. In H. Tumber & S. Waisbord (Eds.), The Routledge companion to media and human rights (pp. 95–103). Routledge. https://doi.org/10.4324/9781315619835

Nissenbaum, H. (2018). Respecting context to protect privacy: Why meaning matters. Science and Engineering Ethics, 24(3), 831–852. https://doi.org/10.1007/s11948-015-9674-9

Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our digital lives (pp. 10–24). Cambridge University Press.
