Data Bill: Not Enough Protection For Consumers
Pegasus’ surveillance of WhatsApp users has come to light, but mass trolling, hate speech, fake news and targeted advertising on social media platforms have silenced data citizens, preventing them from critically engaging with the issue. Of late, artificial intelligence has taken a leap, machine learning has started to personalise online systems at a rapid pace, and the Internet of Things is capturing the identity of users and compromising their autonomy. The Personal Data Protection Bill 2019, when it becomes law, should be able to address these concerns.
When information overload is the norm, consent becomes all the more ambiguous and arbitrary. Section 11 of the privacy bill deals with consent and aims to do so comprehensively, but it does not fully capture the reality of the digital world we inhabit.
The bill is now in the custody of a Joint Select Committee of Parliament instead of being considered by the Standing Committee on Information Technology, as the Opposition had demanded. The Winter Session ended on 13 December, only two days after the bill was introduced in the Lok Sabha. Hence the bill is yet to take final shape, but a perusal of its contents so far is troubling and deeply problematic: they signal that data is becoming a new weapon for a hegemonic government to wield.
A new taxonomy
The Bill governs personal data and mandates its localisation. Many regimes want to be able to access individual user’s social media and browser data, because they want to monitor their usage for political purposes. Usually such data is stored on servers overseas, which makes accessing it a Herculean task. The assumption is that if data is stored within the territory of a state, it is much easier to have control over usage and monitoring.
With this in mind, the Indian bill mandates data localisation too. It requires that companies, or “Data Fiduciaries” as the draft law calls them, maintain in India the data of the “Data Principal”, the individual to whom the data pertains. The effect of this rule on the operations and costs of firms will be crucial to observe. Any domestication of data servers will disrupt the competitive advantage enjoyed by data-intensive monopolies and create fissures in investment and international trade.
The key beneficiaries of this will, no doubt, be state law enforcement agencies. They will no longer have to go through a cumbersome process with the foreign territories where a company may be headquartered in order to retrieve data to aid their investigations.
To be fair, citizens repose trust in their government, and therefore localisation is a welcome initiative. However, the bill does not mention how authorities will use this data once they have accessed it. Are there any specific guidelines, rules, regulations or mechanisms for accountability if the state engages in mass surveillance or policing? The bill is conspicuously silent on these aspects of data management. This shadow of doubt is strengthened by section 35 of the proposed law, which would allow the government to exempt “any” agency from the applicability of the data protection law in the name of national security.
Instead of providing these safeguards, the bill presses for localisation and access, finding these necessary in order to better target services and help the government frame policies that integrate user data with government schemes.
That is not all. The consent of a Data Principal should be an essential precondition for processing personal data, even data that has been anonymised. Yet, section 12 of the bill allows the government to use data without consent “for the performance of any function of the state”. Howsoever noble its intention might be, a state with access to citizen data—especially as new technologies enhance data-mining tools like never before—poses a great risk of abuse and unwarranted use. Ethical concerns such as biased profiling, autonomous disbursal of subsidies and harvesting of sensitive citizen data for experimental purposes shake the very foundations of privacy and responsibility.
Furthermore, section 2(36) defines “sensitive personal data” as encompassing financial, health, official-identifier, sex life, sexual orientation, biometric, genetic, transgender status, intersex status, caste or tribe information, religious or political beliefs and affiliations, and any other data categorised by the law. Section 33 goes even further, by creating a new category of “critical personal data” that remains undefined—until the central government notifies what exactly it means.
What constitutes “critical” personal data, how the word “critical” will be defined and interpreted, and how it will differ from “sensitive” personal data cannot be left as a guessing game. This is especially true for a population that has already surrendered its biometric data to the state via the Aadhaar project. Even with the most convincing reassurances on data security, citizens will remain vulnerable. They would be reduced to digitised assets who can be used in discriminatory and exclusionary strategies. In short, meaningful and informed consent—the bedrock of technology law—becomes ambiguous under this bill.
Chapter IX envisages the creation of a new institution, the Data Protection Authority of India, which has been given sweeping powers to call for information and to search, seize and conduct inquiries. Its orders can be appealed before an Appellate Tribunal under Chapter XI, with the Supreme Court as the final arbiter.
If data is the new oil, the penalty provisions in the bill make that evident. Re-identifying or processing de-identified personal data is a punishable offence, attracting a fine of Rs 15 crore or imprisonment of up to three years.
E-commerce platforms such as Amazon and Flipkart, ride services such as Ola and Uber, and dining apps ubiquitously use personal information to process orders. This information feeds free offers, add-on options and several other features tailored by data-mining tools based on user preferences and history. These “recommendations” and their attendant costs reduce the idea of voluntary buyer consent to a formality.
As software code becomes the law through automated transactions, there is little scope for a user to find out the internal mechanisms of any purchase. Meaningful consent is therefore lacking; the only satisfaction on offer is a quick sale and early delivery of the product.
In all online applications, algorithms act as the managers of data. If an algorithm contains biases, turns rogue or is not written with consumer interest in mind, the result is a fraudulent transaction. Even worse is a user’s inability to identify the cause and make a claim to the company or a consumer court. Who is responsible and who will be penalised if it is the algorithm that is deemed the perpetrator? Where do agency and culpability lie when programming has been outsourced or serves multiple clients?
Would any company disclose its algorithms, which are often protected by patent or copyright law? To expect this would go against the basic ownership and proprietary rules of company law. In short, Facebook (which owns Instagram, WhatsApp and Messenger) will not make its algorithms public. Nor do the servers of these popular platforms reside in India. While India’s bill imposes heavy penalties for non-disclosure, it does not give consumers the right to control their data as, say, the General Data Protection Regulation (GDPR) formulated in the European Union (EU) does.
The right to be informed, to seek rectification, erasure, restricted processing, portability and to have access to data are basic rights in the EU model. India should also adopt a consumer-friendly framework and make suitable amendments in the bill to accommodate such remedies.
The right to use one’s data should lie with the user and not with the platform owner. In big-data systems, business models are built on extracting ever more data from users, in diversified forms. A study by ASSOCHAM and PwC states that the number of smartphone users in India is set to reach 859 million by 2022, which means mobile usage has overtaken even education and employment as a high-growth market. This appropriation of information keeps the data citizen in the dark about transparency and accountability, which are essential norms in a democratic society.
How, considering these issues, does one negotiate the rough terrain of data control by large companies that have access to our personal and private information? This bill takes a step in the right direction by making data-holders accountable, but it must include consumer privacy as a non-negotiable right across all forms of usage. This was rightly highlighted by the Justice BN Srikrishna committee report as well as the Supreme Court’s landmark Constitution Bench verdict that read the right to privacy as a fundamental right.
The hope is that this bill will bring data fiduciaries within the ambit of Article 21 of the Constitution, without jeopardising Articles 14 and 19.
The author teaches law at Marwadi University. The views are personal.