Have you ever experienced the following situation: you open an online booking platform to search for flight tickets, and the price changes after every search, usually upwards? If you switch devices or accounts, the ticket price changes again. Registering as a new user can even get you a lower price than your existing account did. When a platform customizes prices for each user like this, it is a strong indication of algorithmic price discrimination.
Other platforms, such as the shopping platform Amazon and the travel booking platform Orbitz, do the same. These e-commerce platforms collect and analyse user data, then use algorithms to predict the highest price each individual user is willing to pay. Customers feel confused and angered by this price discrimination.
Case Review: Flight Pricing Disparity on Fliggy
In January 2024, a video posted on Weibo (one of the most active social platforms in China) attracted public attention. The video exposed the online travel platform Fliggy’s practice of quoting different users different prices. In it, a user searches for the same flight at the same time on three mobile phones with three different accounts, and the fares returned are completely different, with the gap reaching up to 1,000 yuan (about 200 AUD). Such a large price gap makes consumers worry that they have also been secretly overcharged in previous ticket purchases on Fliggy.
Figure 1. A screenshot of the video
Fliggy is an online travel platform owned by Alibaba, one of China’s leading e-commerce companies. With over 400 million registered members, Fliggy is a major player in the Chinese online travel industry. However, even as China’s leading online travel trading platform, Fliggy has faced criticism for abusing price discrimination. In the last year alone, Fliggy received 440 complaints about pricing issues, according to 12315, the Chinese consumer complaint publicity platform.
Figure 2. A screenshot of complaints on 12315
About Algorithmic Price Discrimination
Price discrimination means that sellers charge different consumers different prices based on each consumer’s willingness to pay (Bar-Gill, 2019, p. 218). Generally speaking, sellers that practise price discrimination charge higher prices to users with a higher willingness to pay, such as loyal customers who have purchased many times or shoppers who urgently need a particular product. For new users and hesitant potential customers, sellers tend to offer relatively low prices to attract and test them, converting potential users into actual buyers.
Big data analysis gives platforms predictive capabilities: the future behaviour of individuals and groups can be derived from observable patterns of past online behaviour, and consumers’ purchase intentions can likewise be predicted (Flew, 2021, p. 104). All of this relies on algorithms running in the background. An algorithm, as Flew (2021) puts it, is a set of rules and processes established for activities such as calculation, data processing, and automated reasoning (p. 108). Algorithmic price discrimination enables sellers to parse potential customers into increasingly fine subcategories and then match consumers more precisely with the products and services they need (Bar-Gill, 2019, p. 221). This not only improves transaction success rates but also enhances the platform’s service capabilities.
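To make this concrete, here is a minimal, purely illustrative Python sketch of how segment-based pricing of this kind could work. All of the signals, weights, and fares are hypothetical assumptions invented for illustration; nothing here reflects Fliggy’s actual algorithm.

```python
# Purely illustrative sketch of segment-based pricing. All signals, weights,
# and fares are hypothetical; this is NOT Fliggy's actual algorithm.

BASE_FARE = 1000.0  # hypothetical published fare, in yuan

def predict_wtp_score(user: dict) -> float:
    """Toy willingness-to-pay score in [0, 1] from behavioural signals."""
    score = 0.0
    if user["is_loyal"]:
        score += 0.3                                 # repeat buyers tolerate higher prices
    score += 0.1 * min(user["recent_searches"], 5)   # repeated searches signal urgency
    if user["is_new_account"]:
        score -= 0.2                                 # new accounts get "bait" pricing
    return max(0.0, min(1.0, score))

def quote_fare(user: dict) -> float:
    """Scale the base fare between -15% and +30% by the predicted score."""
    return round(BASE_FARE * (0.85 + 0.45 * predict_wtp_score(user)), 2)

# Three accounts searching the same flight at the same time get three quotes:
accounts = {
    "loyal, urgent": {"is_loyal": True,  "recent_searches": 4, "is_new_account": False},
    "casual":        {"is_loyal": False, "recent_searches": 1, "is_new_account": False},
    "new account":   {"is_loyal": False, "recent_searches": 0, "is_new_account": True},
}
for name, user in accounts.items():
    print(f"{name}: {quote_fare(user)} yuan")
```

Even this toy model quotes the “loyal, urgent” account roughly 300 yuan more than the new account for the same flight, which is exactly the kind of disparity the Weibo video exposed.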
But is everything heading in a better direction? Not quite.
Fliggy’s use of algorithms does not stop at precise matching; it has reached the level of algorithmic abuse. This kind of price discrimination does not improve the quality of a product or service or promote any larger social goal; instead, companies price discriminate to extract as much wealth from consumers as possible (Ezrachi & Stucke, 2016, p. 123). Such behaviour maximizes Fliggy’s profits but harms consumers and the market.
What is the impact?
The impact of abusive algorithmic price discrimination is mainly reflected in two aspects:
Firstly, the fairness and transparency of the consumer market are damaged, disrupting market order. On the one hand, the coding rules at the core of the algorithm are formulated by Fliggy’s technical team, so the algorithm’s goals and execution logic reflect that team’s choices. Behind those rules lies Fliggy’s core objective of maximizing profit, which biases the algorithm and strips it of the fairness it should have. On the other hand, algorithm-driven pricing strategies lack transparency for consumers: it is often difficult for consumers to understand the rules and logic behind them, which puts users in a weak position during transactions. Even when consumers realize they are being treated unfairly, it can be difficult to produce substantive evidence, because all intent is hidden behind the algorithm, a black box that is opaque, complex, and difficult for non-experts to understand (Pasquale, 2015, p. 2).
Secondly, the risk of user privacy exposure increases. The platform’s precise profiling of users relies on large volumes of user data, which encourages excessive data collection. Most of the time, users have no idea when or where Fliggy began recording their behaviour and inferring their purchase intentions. In addition, data collected by the platform may be resold and used for user analysis and prediction by other platforms, further increasing the risk of privacy leakage and abuse.
Let’s take a look at the comments from netizens under that video:
“Making money has always been based on information gap, and in the era of big data, information gap is easily exploited.” @ Anonymous User 1
“It’s 100% price discrimination. If you search for a flight several times in a row, it will increase the price for you. It knows you will definitely go.” @ Anonymous User 2
Short but profound, aren’t they? These comments reveal consumers’ anger at, and helplessness about, Fliggy’s price discrimination. Yet they are not so much a condemnation of Fliggy as a broader criticism of, and concern about, the abuse of algorithms and big data on e-commerce platforms.
On the governance of algorithmic price discrimination
Protecting User Privacy
In China, the regulatory authorities concerned with abusive algorithmic price discrimination include the Cyberspace Administration, the market supervision authorities, and the Ministry of Industry and Information Technology. Through legislation, these authorities have limited the scope of user data that e-commerce platforms may collect and granted users the right to manage the information they have authorized.
On the one hand, the Chinese Consumer Rights Protection Law requires platform operators to inform users about the collection and use of personal information, including clearly stating the purpose, method, and scope of collection and use, and obtaining consumer consent. Limiting platform operators’ access to consumers’ willingness to pay can curb algorithmic price discrimination fuelled by big data (Bar-Gill, 2019, p. 242). Fliggy’s privacy policy lists the scope and purpose of the data it collects for each platform functional scenario, which indirectly clarifies the data sources and usage scenarios of its algorithms.
On the other hand, the Personal Information Protection Law grants users the right to manage their authorized personal information. This includes the ability to view, modify, and withdraw information authorization on the platform. Fliggy has added information management portals to its privacy policy, making it easy for users to manage their personal information.
Figure 3. A screenshot of Fliggy’s privacy policy
However, Chinese law does not define the specific scope of user data that may be collected, which means the platform can treat almost any data as user-authorized. Users who read the information list shown by Fliggy carefully will find that the scope of required information is quite extensive: browsing history, search records, transaction information, and even interaction information. These data not only support Fliggy’s personalized recommendations but can also be used to analyse consumers’ willingness to pay. And although users have the right to withdraw authorization, in practice few consciously manage their authorized personal information.
This relates to the fundamental social conflict facing big data: how to balance the examination of subjects based on causality with the examination of all available information based on correlation (Ebers & Cantero Gamito, 2020, p. 44). Put more plainly: how can data be used fully for analysis and decision-making while keeping the use of user data within a reasonable range? So far, China’s regulations only require disclosure; they do not actually delimit the specific scope of information that may be collected. Laws limiting the scope of user information still need improvement.
Figure 4. Fliggy
Because the relevant laws remain imperfect, it is vital that regulatory authorities intervene to safeguard users’ rights once price discrimination occurs. Yet although many regulators in China are involved in addressing abusive algorithmic price discrimination, there is no clear division of powers over its regulation.
In practice, this can lead to gaps in oversight or duplicated supervision (Tang, 2022). Liang (2022) and his team have studied algorithm governance bodies around the world (p. 30). They argue that it is not yet appropriate to establish a single new regulatory agency in China; instead, drawing on the existing joint-meeting system of the anti-monopoly and anti-unfair-competition departments, a coordination mechanism could be built across multiple ministries and commissions (Liang, 2022, p. 30). Furthermore, Tao and Zhang (2022) propose a government-led pluralistic governance model: the government plays a leading role in coordinating organizations, enterprises, and individuals, reconstructing the relationships among stakeholders through consultation and cooperation (p. 17).
Personalized Disclosure of Prices
Operators shall not set different prices or charging standards for the same goods or services under the same transaction conditions without the knowledge of consumers. —— Regulations for the Implementation of the Consumer Rights Protection Law of the People’s Republic of China (State Council, 2024)
In March 2024, China introduced administrative regulations that, for the first time, explicitly prohibit abusive algorithmic price discrimination. These regulations are a powerful weapon against the practice, as well as an important piece of evidence for users seeking to defend their rights. The right to non-discrimination is indispensable in any pluralistic society, and its purpose is not to prohibit all forms of differentiation, only the most egregious and morally reprehensible ones (Eder, 2021, p. 32). Preventing abusive price discrimination therefore does not mean keeping prices constant indefinitely. Bar-Gill (2019) suggests that personalized disclosures can prevent sellers from setting prices higher than what consumers are willing to pay (p. 223). Personalized disclosure means that the seller provides each consumer with individualized information about the product’s true value to that particular consumer. For example, on the ticket order page, Fliggy breaks down the quoted fare, showing the base price, the user’s discounts, taxes, and so on. This lets users understand the price structure and judge whether the price is above or below their expectations, thereby reducing the negative effects of price discrimination.
Figure 5. A screenshot of a personal disclosure on Fliggy
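As a thought experiment, here is a minimal Python sketch of what such a fare breakdown could look like as a data structure. The field names and amounts are hypothetical assumptions for illustration; this is not Fliggy’s actual order-page data model.

```python
from dataclasses import dataclass, field

@dataclass
class FareDisclosure:
    """Hypothetical personalized fare breakdown; not Fliggy's real data model."""
    base_fare: float                               # published fare before adjustments (yuan)
    taxes_and_fees: float                          # airport tax, fuel surcharge, etc.
    discounts: list = field(default_factory=list)  # (label, amount) pairs for this user

    def total(self) -> float:
        return self.base_fare + self.taxes_and_fees - sum(a for _, a in self.discounts)

    def explain(self) -> str:
        """Render the breakdown so the user can audit the quote line by line."""
        lines = [f"Base fare       {self.base_fare:>8.2f}",
                 f"Taxes and fees  {self.taxes_and_fees:>8.2f}"]
        for label, amount in self.discounts:
            lines.append(f"{label:<15} {-amount:>8.2f}")
        lines.append(f"Total           {self.total():>8.2f}")
        return "\n".join(lines)

quote = FareDisclosure(base_fare=1180.0, taxes_and_fees=110.0,
                       discounts=[("Member coupon", 30.0)])
print(quote.explain())
```

The point of such a structure is that every component of the final price is visible and auditable, so a user can compare the quote against their own expectations rather than facing a single opaque number.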
Personalized disclosures for hundreds of millions of users would be generated by platforms using big data analytics, artificial intelligence, and other data-processing methods, which requires significant human and financial investment. And while sellers willingly invest heavily in determining a user’s willingness to pay, they have little incentive to keep investing in helping users see through their own misperceptions of a product’s value. Personalized disclosure may therefore face implementation challenges.
Conclusion
Algorithmic price discrimination on e-commerce platforms is already a common phenomenon worldwide, covering fields such as shopping, travel, and transportation. Such behaviour not only disrupts market order but also harms user privacy. The Chinese government grants users rights to safeguard their privacy, while platforms disclose the scope of personal information their algorithms require and provide personalized pricing disclosures. This limits the algorithm’s data sources and scope of operation and helps establish trust with users. However, China’s regulatory framework and legislation are not yet complete, and some methods of restricting algorithms face implementation challenges. Through government guidance, corporate execution, and supervision by organizations and individuals, a cooperative network can be established to jointly improve the governance of algorithmic price discrimination.
References
Bar-Gill, O. (2019). Algorithmic price discrimination when demand is a function of both preferences and (mis)perceptions. The University of Chicago Law Review, 86(2), 217–254.
Flew, T. (2021). Regulating platforms. Polity Press.
Ezrachi, A., & Stucke, M. E. (2016). Behavioral discrimination: Economic and social perspectives. In Virtual competition: The promise and perils of the algorithm-driven economy (pp. 117–130). Harvard University Press. https://doi.org/10.4159/9780674973336-012
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Ebers, M., & Cantero Gamito, M. (Eds.). (2020). Algorithmic governance and governance of algorithms: Legal and ethical challenges. Springer. https://doi.org/10.1007/978-3-030-50559-2
Tang, W. (2022). Legal regulation of big data discriminatory pricing behaviour [Doctoral dissertation, Southwest University of Political Science and Law]. http://202.202.90.45:8080/xnzfdx/item/itemDetail/79491.shtml
Tao, B., & Zhang, D. (2022). Curb swindling money out of old customers by big data: A collaborative governance model led by the government. Journal of Changzhou University (Social Science Edition), 23(5).
Liang, Z. (2022). China algorithm governance policy research report. Tsinghua University. https://aiig.tsinghua.edu.cn/info/1364/1848.htm
State Council. (2024). Regulations for the implementation of the Consumer Rights Protection Law of the People’s Republic of China. https://www.gov.cn/zhengce/content/202403/content_6940158.htm
Eder, N. (2021). Privacy, non-discrimination and equal treatment: Developing a fundamental rights response to behavioural profiling. In M. Ebers & M. Cantero Gamito (Eds.), Algorithmic governance and governance of algorithms: Legal and ethical challenges. Springer. https://doi.org/10.1007/978-3-030-50559-2_2
Images
Vas, A. (2018). MacBook Pro turned on [Photograph]. Unsplash. https://unsplash.com/photos/macbook-pro-turned-on-Bd7gNnWJBkU
Preez, P. D. (2017). Person using smartphone [Photograph]. Unsplash. https://unsplash.com/photos/person-using-smartphone-BjhUu6BpUZA