The human cost of fintech lending in Indonesia
Once again, tragedy has struck Indonesia's digital sphere, with a digital platform implicated in the loss of a life. In May 2023, a man ended his life after being terrorized by a debt collector from a financial technology (fintech) lending company (VOI, 2023). The incident mirrored a case from February 2019, when a taxi driver was found dead in his room, leaving a handwritten note urging the Financial Services Authority (OJK) to put a stop to fintech lending, which he called a "devil's trap" (The Jakarta Post, 2019).
The underlying issue is that fintech companies have excessive access to borrowers' data, justified under the "Know Your Customer" (KYC) process, which allows financial institutions to assess borrowers' eligibility and guard against non-performing loans and money laundering (VOI, 2023). The data collected ranges from personal identification to far more private material, such as social media accounts, contact lists, and even phone galleries (Antara, 2021).
As a result, borrowers have been blackmailed and subjected to multiple forms of online humiliation through the disclosure of their private information to relatives. This leaves victims in a dire position, facing both the pressure of repaying the loan and public humiliation (The Jakarta Post, 2019). The Jakarta Legal Aid Institute (LBH Jakarta) has reported that almost 300 people have been blackmailed by debt collectors who exposed their private information online (The Jakarta Post, 2019).
What Is Fintech Lending, and Why Does It Challenge Privacy Protection?
Fintech lending is a technological innovation that has emerged as an alternative for those seeking credit outside rigid traditional financial institutions. Fintech lenders typically operate as online startups that minimize human interaction, making the lending process quicker, more efficient, and flexible enough to meet people's needs (Chou, 2020). Because of this flexibility, fintech can provide broader access to credit, even for unbanked individuals (Chou, 2020). Consequently, fintech lenders have become popular among underserved or marginalized groups deemed ineligible for loans by traditional financial institutions (PwC, 2019).
Unlike traditional lending institutions, which require customers to physically visit, fill out forms, and undergo interviews to assess loan eligibility, fintech lenders utilize a wide array of data and technology (Alekseenko, 2022). They efficiently grant loans or determine consumer interest rates through online apps (Chou, 2020). However, a challenge arises with unbanked individuals, as they are not recorded in the financial system, making it difficult to score their risk (Chou, 2020). Fintech offers a solution by expanding the types of data collected from customers, including rent and utility payments, job stability, and monthly expenditures (PwC, 2019).
In addition, the absence of regulation explicitly addressing the types of data that fintech companies may collect has become a significant concern. Many companies expand their collection from financial data to "character" data, on the premise that a person's personality can predict their ability to repay a loan (Chou, 2020). Consequently, fintech lenders harvest more private information from borrowers' phones, such as contact lists, galleries, and social media data (Alekseenko, 2022). Even though it is obtained with consumer consent, this expansion of data collection poses a profound threat to privacy (Alekseenko, 2022).
The Unresolved Challenges of Fintech Lending Regulation
The Government of Indonesia has taken steps to address privacy and data protection by issuing several regulations (Suryono et al., 2021). In 2022, the OJK amended its fintech regulation through OJK Regulation 10/2022 (Idn), which requires fintech companies to obtain consumer consent for data processing. In the same year, Indonesia enacted Law No. 27/2022 on Personal Data Protection (PDP) (Idn), which obliges any person or entity processing data on Indonesian citizens to adhere to data protection principles; the law applies across all business sectors, including digital platforms. Finally, in 2023, the OJK issued Circular Letter No. 19/SEOJK.06/2023 (Idn) on fintech services, which serves as guidance for fintech business operations and, among other things, prohibits using emergency contacts as a method of debt collection.
However, despite these efforts, the problem persists, raising questions about the regulations' effectiveness in combating online harms.
I argue that the problem lies in a regulatory gap: no rule specifies which data fintech lenders may actually obtain. Even though the law incorporates the principles of purpose limitation and data minimization, the government still needs to set boundaries and detail the kinds of data that can be collected in the KYC process. Failing to do so is akin to handing businesses a "blank check," granting them discretionary authority to gather excessive user information and thereby threatening privacy (Laurinaitis et al., 2021). This contradicts the aims of personal data protection in the first place.
Privacy as a Digital Right?
Academic debates surrounding privacy in the digital age revolve around two central poles (Smith et al., 2011). The first group of scholars argues that digital rights are an extension of human rights (Karppinen, 2017). Scholars holding this view stress the significance of human rights concerns in online realms, such as privacy and data protection (Karppinen, 2017). Rengel (2013) traces this argument to the phrase "the right to be let alone," advocating that individuals should be shielded from unwanted access by others. This includes the right to control personal information and the protection of individual dignity and individuality (Rengel, 2013). The right to privacy should therefore be safeguarded as a guarantor of physical security and control over one's personal space (Flew, 2021).
On the other hand, the era of platformization, in which data fuels business processes, poses a challenge to privacy protection (DeVries, 2003). The trade-off between personal data and access to services or lending has become a significant issue for privacy protection (Flew, 2021). When privacy is examined through consumer behavior, a "privacy paradox" emerges: despite their privacy concerns, consumers consciously hand over personal data in exchange for access to services (Flew, 2021; Smith et al., 2011).
This fuels the counter-argument that privacy is not an absolute right but can instead be treated as an economic value that individuals weigh in a cost-benefit calculation (Smith et al., 2011). Consequently, the challenge of privacy protection lies in striking a balance between rights and economic value within the regulatory framework, a balance governments often neglect as they tend to lean toward only one pole.
The Problem of People at the Margins
So far, we have seen that privacy is challenged by the economic trade-off involved in obtaining access to lending. But if no coercion is involved in the data collection process and people are free to choose any financial institution, what, exactly, has gone wrong?
One might easily argue that customers who truly value their privacy should simply stop using fintech lending and opt for traditional banking instead.
However, this argument overlooks the complexities highlighted by Marwick (2018), who points to how difficult privacy is to achieve for people living at the margins. Digital rights discourse holds that every person deserves to be "let alone"; in reality, however, those who can protect their privacy are usually the ones who enjoy economic or social privilege (Marwick, 2018).
Wealthy individuals have the luxury of choosing financial institutions that match their preferences and are more likely to have their privacy protected. In contrast, unbanked, underserved, or poor people may find themselves "forced" to hand over whatever information fintech lending platforms request in exchange for access to loans (Chou, 2020). The more marginalized someone is from the financial system, the harder it becomes to opt out of sharing personal information (Marwick, 2018).
As Flew (2021) outlines, regulating a platform is not an easy task. The tech industry often portrays itself as engaging in a give-and-take dynamic, in which voluntary disclosure of personal information is exchanged for benefits. As data becomes the fuel of this industry, the distinction between choice and coercion grows increasingly blurred (Marwick, 2018).
Yet the tragic outcomes, including deaths and suicides linked to privacy breaches, have undeniably crossed a line. They underscore the urgent need for government intervention to set clear boundaries that balance the KYC process with privacy protection, ensuring that the sanctity of life is preserved above all (Laurinaitis et al., 2021).
Purpose Limitation to Stop Online Harms
I am not suggesting that fintech lenders should loosen their KYC processes; the financial sector plays a vital role in macroeconomic stability, and no one wants the industry to collapse as it did in the 2008 crisis. Rather, I argue that the principles of data minimization and purpose limitation should be implemented in a balanced manner to protect privacy (Forgó et al., 2017). These principles are crucial to safeguarding it.
Purpose limitation is the principle that the purposes for which personal data will be used must be specified before collection begins, and that collection must be restricted to what is necessary for those purposes (Nissenbaum, 2018). Furthermore, collected personal data should not be disclosed, made available, or used for another purpose except with the consumer's consent or by authority of law (Nissenbaum, 2018). The principle therefore hinges on the "purpose," which is set by the data collector or specified by the government (Laurinaitis et al., 2021).
Consequently, without specific regulation, businesses acting as data collectors can rationalize almost any data request as part of the KYC process, even when the justification is thin (Alekseenko, 2022). For example, fintech lenders might ask consumers to grant access to their contact lists and galleries under the guise of collecting emergency contacts or building a risk profile.
In reality, however, this data is often not used for the stated purposes; instead, it becomes a tool for blackmailing customers and violating their privacy by exposing personal information to the relatives listed in their contacts (The Jakarta Post, 2019). Debt collectors frequently threaten to expose borrowers' private gallery content to coerce repayment, leading to public shaming (Antara, 2021). This strips borrowers of any control over their data.
A Call for Action
Even though Indonesia's fintech regulations require consent and adherence to data collection principles, the problem persists because they lack detailed data minimization rules and guidance on purpose limitation (Laurinaitis et al., 2021). This is particularly critical in the financial sector, where most of the data is sensitive, so a clear line must be drawn to give effect to these principles. The most recent policy to specifically address purpose limitation is OJK Circular Letter No. 19/SEOJK.06/2023 (Idn), which prohibits using emergency contacts as a method of debt collection.
However, according to some administrative law scholars, a circular letter is not part of the hierarchy of Indonesian legislation under Law No. 12/2011 on the Establishment of Legislation (Idn) (Bintoro et al., 2018; Widiarto et al., 2022). This means a circular letter has no authority to act as state regulation and can serve only as internal guidance within the issuing institution (Bintoro et al., 2018). If the OJK intends to regulate and legally bind parties on this issue, it should do so through an OJK Regulation rather than a Circular Letter. The current arrangement sits uneasily with the Constitution and raises questions about the legal force a circular letter actually holds (Bintoro et al., 2018).
Reference List:
Alekseenko, A. P. (2022). Privacy, Data Protection, and Public Interest Considerations for Fintech. In H.-Y. Chen, P. Jenweeranon, & N. Alam (Eds.), Global Perspectives in FinTech: Law, Finance and Technology (pp. 25–49). Springer International Publishing. https://doi.org/10.1007/978-3-031-11954-5_3
Antara. (2021, November 27). Government’s role in safeguarding citizens from illegal online loans. Antara News. https://en.antaranews.com/news/201717/governments-role-in-safeguarding-citizens-from-illegal-online-loans
Bintoro, R. W., Shomad, A., & Prasastinah Usanti, T. (2018). Standard Issuance of Circular Letters in The Implementation Of Judicial Power. SHS Web of Conferences, 54, 03021. https://doi.org/10.1051/shsconf/20185403021
Chou, A. (2020). What’s in the “Black Box”? Balancing Financial Inclusion and Privacy in Digital Consumer Lending. Duke Law Journal, 69(5), 1183–1217. https://scholarship.law.duke.edu/dlj/vol69/iss5/4
DeVries, W. T. (2003). Protecting Privacy in the Digital Age. Berkeley Technology Law Journal, 18(1), 283–311. https://www.jstor.org/stable/24120519
Flew, T. (2021). Regulating platforms. Polity Press.
Forgó, N., Hänold, S., & Schütze, B. (2017). The Principle of Purpose Limitation and Big Data. In M. Corrales, M. Fenwick, & N. Forgó (Eds.), New Technology, Big Data and the Law (pp. 17–42). Springer. https://doi.org/10.1007/978-981-10-5038-1_2
Karppinen, K. (2017). Human rights and the digital. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media and Human Rights (1st ed., pp. 95–103). Routledge. https://doi.org/10.4324/9781315619835-9
Laurinaitis, M., Štitilis, D., & Verenius, E. (2021). Implementation of the personal data minimization principle in financial institutions: Lithuania’s case. Journal of Money Laundering Control, 24(4), 664–680. https://doi.org/10.1108/JMLC-11-2020-0128
Law No. 12/2011 on the Establishment of Legislation (Idn).
Law No. 27/2022 on Personal Data Protection (PDP) (Idn).
Marwick, A. E. (2018). Understanding privacy at the margins: Introduction.
Nissenbaum, H. (2018). Respecting Context to Protect Privacy: Why Meaning Matters. Science and Engineering Ethics, 24(3), 831–852. https://doi.org/10.1007/s11948-015-9674-9
OJK Circular Letter No. 19/SEOJK.06/2023 (Idn).
OJK Regulation 10/2022 on Peer-to-Peer Lending Sector (Idn).
PwC. (2019). Indonesia’s Fintech Lending: Driving Economic Growth through Financial Inclusion—Executive Summary (PwC Indonesia-Fintech Series). https://www.pwc.com/id/en/industry-sectors/financial-services/fintech-lending.html
Rengel, A. (2013). Privacy in the 21st Century. In Privacy in the 21st Century. Brill Nijhoff. https://brill.com/display/title/24202
Smith, H. J., Dinev, T., & Xu, H. (2011). Information Privacy Research: An Interdisciplinary Review. MIS Quarterly, 35(4), 989. https://doi.org/10.2307/41409970
Solove, D. (2006). A Taxonomy of Privacy. University of Pennsylvania Law Review, 154(3), 477. https://scholarship.law.upenn.edu/penn_law_review/vol154/iss3/1
Suryono, R. R., Budi, I., & Purwandari, B. (2021). Detection of fintech P2P lending issues in Indonesia. Heliyon, 7(4), e06782. https://doi.org/10.1016/j.heliyon.2021.e06782
The Jakarta Post. (2019, February 12). Man commits suicide to escape “devil’s trap” of debt from fintech firms. The Jakarta Post. https://www.thejakartapost.com/news/2019/02/12/man-commits-suicide-to-escape-devils-trap-of-debt-from-fintech-firms.html
VOI. (2023, September 21). Customers Suicide After Being Terrorized By Debt Collector, Ada Kami Boss Will Completely Investigate. VOI – Waktunya Merevolusi Pemberitaan. https://voi.id/en/economy/312765
Widiarto, A. E., Dahlan, M., & Arrsa, R. C. (2022). The construction of legal basis relevant to the state of law in the event of pandemic emergency: A lesson from Indonesia. Legality: Jurnal Ilmiah Hukum, 30(2), Article 2. https://doi.org/10.22219/ljih.v30i2.23553