CONSENT IN DATA SHARING: WHAT DOES THE LAW REQUIRE IN THE DIGITAL AGE

This article is written by Nisha Kiran, BA-LLB (3rd year), Presidency University, Bangalore, during their internship at LeDroit India.

Keywords: Data Privacy; Informed Consent; Digital Age; Data Protection Law; Personal Data; GDPR

ABSTRACT

Consent in data sharing has become a cornerstone of data privacy law in the digital age. As personal data flows across borders and platforms, legal frameworks increasingly mandate informed consent before personal information can be collected, shared, or processed. The first line of defense in data protection is ensuring that users’ permission – their consent – is obtained under strict standards. This article examines what various laws require for valid consent in data sharing, looking at global benchmarks like the EU’s GDPR and comparing them with approaches in countries such as the United States and India. It discusses the key elements of legally valid consent (freely given, specific, informed, unambiguous, and revocable) and explores illustrative case laws and examples (from landmark judgments to recent developments) that highlight how consent requirements are enforced in the digital era. The goal is to provide a comprehensive overview of how consent is defined and regulated by law, the challenges of obtaining meaningful consent online, and what emerging data protection regimes mean for individuals’ rights and organizations’ responsibilities.

INTRODUCTION

Consent in data sharing is not a mere formality; it is a legal requirement defined in detail by modern data protection laws. Broadly, consent means an individual’s freely given and informed choice to allow their personal data to be used for specified purposes. However, experience has shown that in the digital context, users often face lengthy privacy policies, “take-it-or-leave-it” conditions, and opaque data practices. To counter this, laws in the digital age have tightened the definition of consent: for instance, under the EU’s General Data Protection Regulation (GDPR), consent must be a “freely given, specific, informed and unambiguous” indication of the user’s wishes, given via a clear affirmative act. In India, the new Digital Personal Data Protection Act, 2023 similarly mandates that consent be “free, specific, informed, unconditional and unambiguous with a clear affirmative action”. These definitions show remarkable convergence on core principles, even as different jurisdictions implement them through their own laws.

In this article, we will explore the legal requirements for valid consent in data sharing across major jurisdictions. We will look at global standards set by the GDPR in the European Union, the sectoral approach of the United States (including laws like COPPA for children’s data), and the evolving data protection regime in India. Along the way, we will highlight key case laws and examples that illustrate how consent requirements are applied in practice – from landmark judgments like K.S. Puttaswamy (Retd.) v. Union of India, which affirmed privacy as a fundamental right, to enforcement actions like the EU’s ruling in Planet49 (2019) that pre-ticked checkboxes do not amount to valid consent. We will also discuss challenges in obtaining meaningful consent in the digital age, such as consent fatigue and information asymmetry, and consider how laws attempt to address these issues. Finally, a critical analysis in the conclusion will tie together these insights, evaluating whether the current consent-centric model effectively protects individuals in an era of big data and ubiquitous information sharing.

GLOBAL LEGAL FRAMEWORKS FOR DATA SHARING CONSENT

EUROPEAN UNION: GDPR AND THE GOLD STANDARD FOR CONSENT

When it comes to data protection and consent, the European Union’s General Data Protection Regulation (GDPR) is widely regarded as the gold standard. The GDPR, which came into effect in May 2018, revolutionized data privacy law worldwide and heavily influenced other jurisdictions. Under the GDPR, consent is one of the six lawful bases for processing personal data, but it is subject to strict conditions. Article 4(11) of GDPR defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she… signifies agreement to the processing” of their personal data. In practice, this means that for consent to be valid under EU law, individuals must have a genuine choice and control over how their data is used. They should not be forced or tricked into consenting, and they must understand what they are agreeing to.

Key requirements under GDPR for consent include:

  • Freely Given: Consent must be voluntary, with no coercion or undue pressure. Users should be able to refuse consent without suffering detriment. For example, a service generally cannot refuse service if a user doesn’t consent to data processing that isn’t essential to that service. 
  • Specific and Informed: Consent must be purpose-specific and based on adequate information. The GDPR requires that individuals are told exactly what data is collected, by whom, and for what purpose before they consent. 
  • Unambiguous Affirmative Act: The individual’s consent must be indicated through a clear affirmative action – an opt-in, not an opt-out. Silence, pre-ticked checkboxes, or inactivity do not constitute consent under GDPR. In the Planet49 case, the EU Court emphatically held that a pre-ticked box (which the user would have to un-tick to refuse consent) is not valid consent. 
  • Documentation and Withdrawal: Article 7 of the GDPR further requires organizations to keep a record of the consent given by users and allows users to withdraw consent at any time, as easily as it was given; a minimal sketch of such a consent record appears after this list. 
  • Explicit Consent for Sensitive Data: GDPR distinguishes “special categories” of personal data (racial/ethnic origin, health data, biometric identifiers, etc.), which generally require explicit consent for processing (Article 9 GDPR) unless another narrow exception applies. 
  • Parental Consent for Minors: Recognizing that children deserve enhanced protection, GDPR requires parental consent for processing personal data of children under 16 in the context of online services offered directly to a child (individual EU member states can lower this age to as low as 13). 
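
To make these conditions concrete, the sketch below models a hypothetical consent record and checks it against the GDPR-style criteria discussed above (affirmative opt-in, no pre-ticked boxes, purpose specificity, an informed notice, and easy withdrawal). The class, field names, and validity check are illustrative assumptions, not a real library or an official compliance test.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional


@dataclass
class ConsentRecord:
    """Hypothetical record of a single consent event (illustrative only)."""
    user_id: str
    purposes: list[str]                      # the specific purposes agreed to
    notice_shown: bool                       # a clear privacy notice was presented
    affirmative_act: bool                    # explicit opt-in action by the user
    pre_ticked: bool                         # the box was pre-selected for the user
    given_at: datetime = field(default_factory=datetime.utcnow)
    withdrawn_at: Optional[datetime] = None

    def is_valid(self) -> bool:
        """Rough check against the GDPR-style conditions discussed above."""
        return (
            self.affirmative_act             # unambiguous affirmative act (opt-in)
            and not self.pre_ticked          # Planet49: pre-ticked boxes are not consent
            and self.notice_shown            # informed
            and len(self.purposes) > 0       # specific and purpose-bound
            and self.withdrawn_at is None    # consent has not been withdrawn
        )

    def withdraw(self) -> None:
        """Withdrawing must be as easy as giving consent (cf. GDPR Art. 7(3))."""
        self.withdrawn_at = datetime.utcnow()


# Example: a record created from a pre-ticked box fails the check.
record = ConsentRecord("user-1", ["newsletter"], notice_shown=True,
                       affirmative_act=True, pre_ticked=True)
print(record.is_valid())  # False
```

A real deployment would also store the version of the notice shown and keep these records durably, since Article 7(1) GDPR places the burden of demonstrating that consent was obtained on the controller.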

INDIA: EVOLVING DATA PROTECTION LAW AND CONSENT REQUIREMENTS

India has been rapidly updating its data protection framework in recent years, especially in the wake of the Supreme Court’s landmark judgment in Justice K.S. Puttaswamy (Retd.) v. Union of India (2017). In that case, a nine-judge bench unanimously recognized privacy (including informational privacy) as a fundamental right under the Indian Constitution. The court’s recognition that individuals have a fundamental right to control the privacy of their personal data created a strong push for comprehensive data protection legislation in India, focusing heavily on consent and user rights.

After extensive debates and drafts (including a 2019 Personal Data Protection Bill that was later withdrawn), India enacted the Digital Personal Data Protection Act, 2023 (DPDP Act). This new law, for the first time, establishes a cross-sectoral data privacy regime in India and is heavily influenced by principles from the GDPR, though with some simplifications and local adaptations.

Under the DPDP Act, consent is the primary basis for processing personal data, except in certain delineated “legitimate uses” where consent may not be required. The Act explicitly states that personal data can be processed only for a lawful purpose, either with the individual’s consent or for certain legitimate uses defined by the law. 

KEY FEATURES OF CONSENT UNDER INDIA’S DPDP ACT, 2023 INCLUDE:

  • Notice and Informed Consent: Before obtaining consent, the data fiduciary (i.e. data controller) must give the individual (data principal) a clear notice describing what personal data will be collected and the purpose of processing, along with details of any third parties with whom the data will be shared, how long the data will be retained, and the manner in which consent may be withdrawn, among other details (an illustrative sketch of such a notice appears after this list). 
  • Unambiguous Affirmative Action: The Act requires an unambiguous affirmative action – meaning the consent must be given through a clear positive act by the user (ticking a box, clicking “Agree”, etc.). 
  • Withdrawal and Duration: Individuals have the right to withdraw consent at any time, and it should be as easy to withdraw as to give (similar to GDPR). Upon withdrawal, the data fiduciary must stop processing the data (if consent was the only basis) and delete the data unless it’s required to be retained by law. 
  • Consent Managers: An interesting concept introduced (in earlier drafts and enabled in the Act) is that of Consent Managers – entities that act as guardians of individuals’ consents, helping people manage the consents they have given to various services. 
  • Deemed Consent (Legitimate Uses): The DPDP Act specifies certain circumstances as “legitimate uses” where consent is not required for processing. These are akin to exceptions or alternate legal bases. They include situations such as: when processing is necessary for state-provided benefits or services (and the individual has previously consented to a similar service), for compliance with law or court orders, for medical emergencies, for public interest purposes such as epidemics or disasters, and for employment-related data (in certain contexts). 
  • Children’s Data: The Indian law is actually stricter than many others regarding children: it defines anyone under 18 as a “child”. Data fiduciaries cannot process a child’s personal data without verifiable consent of the parent or lawful guardian. Additionally, the Act prohibits certain types of processing for children, like behavioral tracking or targeted advertising directed at children. 
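
As an illustration only, the sketch below models the kind of pre-consent notice a data fiduciary might assemble, together with a simple gate that blocks behavioural tracking and targeted advertising for anyone under 18 and otherwise requires verifiable parental consent, mirroring the features listed above. All names, fields, and the gating logic are assumptions for explanation, not the Act’s actual text or any real consent-manager API.

```python
from dataclasses import dataclass


@dataclass
class DPDPNotice:
    """Hypothetical contents of a pre-consent notice to a data principal."""
    data_items: list[str]         # what personal data will be collected
    purpose: str                  # why it will be processed
    third_parties: list[str]      # who it may be shared with
    retention_days: int           # how long it will be retained
    withdrawal_instructions: str  # how consent can be withdrawn


# Purposes barred for children regardless of consent (as described above).
RESTRICTED_CHILD_PURPOSES = {"behavioural_tracking", "targeted_advertising"}


def may_process(age: int, purpose: str, parental_consent: bool) -> bool:
    """Very rough gate reflecting the child-data features listed above."""
    if age < 18:                              # the Act treats anyone under 18 as a child
        if purpose in RESTRICTED_CHILD_PURPOSES:
            return False                      # prohibited outright for children
        return parental_consent               # otherwise verifiable parental consent is needed
    return True                               # adults proceed to the normal consent flow


# Example: a 16-year-old cannot be targeted with ads even with parental consent.
print(may_process(16, "targeted_advertising", parental_consent=True))  # False
```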

Before the DPDP Act, India’s data protection was governed by the Information Technology Act, 2000 and related rules (notably the 2011 “SPDI Rules” which required written consent for collecting sensitive personal data like health or financial information). 

Case Law Illustration: Even before the new Act, Indian courts dealt with consent and privacy in specific cases. For example, in the Puttaswamy judgment of 2017 (privacy case), the Supreme Court didn’t lay down specific consent rules but cited the need for a data protection law that would, among other things, regulate how consent must be obtained and respected. Another instance was the controversy around the Aadhaar biometric ID system. In Puttaswamy v. Union of India (Aadhaar case, 2018), the Supreme Court upheld the validity of Aadhaar for welfare schemes but struck down its mandatory linking to services like mobile phone connections and bank accounts, partly because such linking (and the data sharing involved) was not backed by law and raised privacy concerns – effectively, people were forced to “consent” to link Aadhaar to every service, which the Court found disproportionate and unconstitutional. While the Court did not use the word “consent” in a contractual sense, the case underscored that individuals should not be compelled, without proper safeguards, to share their personal data broadly. This resonates with the idea that consent must be free and informed, not coerced by the necessity of living a normal life.

In conclusion, India’s approach in the digital age is now largely in harmony with global standards on consent. Consent is king – except where a clear exception is carved out, organizations must seek user permission under strict guidelines. As India implements this new law (in 2024 and beyond, with rules to be framed and institutions to be set up), the culture around data sharing is expected to shift toward more transparency and user control. Companies serving Indian users will have to ensure their apps and websites include granular consent notices, easy opt-outs, and parental consent mechanisms for minors. Given India’s huge internet user base, this legal development is significant globally – it means a large portion of the world’s digital users will be covered by a consent-focused data protection regime.

EXCEPTIONS AND ALTERNATIVES TO CONSENT

While consent is a central pillar of data protection law, most regimes recognize that requiring consent for every data use is not always practical or even desirable. There are situations where insisting on consent could impede necessary activities or where consent might be inherently non-free (such as an employer asking an employee – the power imbalance means consent can be questionable). Therefore, laws provide for exceptions or alternative legal bases where personal data can be processed without consent, as long as certain conditions are met. It’s important to know these, to understand the full picture of “what the law requires” – because sometimes the law requires consent, and sometimes it explicitly does not (or provides a different pathway).

Under GDPR, for example, there are five other lawful bases for processing besides consent:

  • Contractual necessity: If processing personal data is necessary to fulfill a contract with the individual, you don’t need separate consent. E.g., if you buy something online, the seller can use your address to ship it and your credit card info to bill you as part of performing the contract – they don’t need to get your consent for those uses, because the sale contract implies it.
  • Legal obligation: If a law requires the data controller to process or disclose data, consent is not needed (and in fact, consent would not make a difference). For instance, financial institutions have to report certain transactions to government for anti-money laundering laws – they do this because the law obliges them, not because the customer consented.
  • Vital interests: This covers emergencies where data processing is necessary to protect someone’s life or physical integrity. For example, a hospital can share a patient’s info with another hospital in an emergency without consent if the patient is unconscious – it’s in the vital interest of the patient.
  • Public interest or official authority: Data processing by government or other bodies may be done under this basis when it’s needed for tasks in the public interest or in the exercise of official authority (like a census, or public health data collection during a pandemic). Again, individual consent isn’t required for these mandated tasks, though typically there will be other safeguards.
  • Legitimate interests: A controller (other than a public authority performing its official tasks) may process personal data where necessary for its own or a third party’s legitimate interests, provided those interests are not overridden by the individual’s rights and freedoms. Fraud prevention and network security are classic examples; the controller must weigh its interests against the individual’s rights in a balancing test rather than simply rely on consent.

India’s DPDP Act follows a similar concept through its “legitimate uses” (deemed consent) provisions. These essentially cover many of the same exceptions: compliance with law or court orders, medical emergencies, breakdown of public order, etc., and even certain “public interest” and “government benefit” contexts. In those scenarios, the law is saying that society’s interests or the individual’s own vital interests justify data use without the cumbersome process of getting consent from each data principal.

For example, during the COVID-19 pandemic, governments and apps processed a lot of personal health and location data for contact tracing and public health measures. If each instance required consent, it could slow down or impair the public health response. Many jurisdictions therefore allowed such processing under emergency powers or public interest exceptions rather than consent. (There was still debate on how much should have been consent-based – e.g., some contact tracing apps were voluntary/consent-based, others were mandatory.)

Another notable scenario that sidesteps consent is contractual necessity. If you sign up for a service and provide your email, the service can send you transactional emails (password resets, notifications about your account) without needing consent each time – because you gave the email as part of the service. However, if they want to send you marketing emails, that is outside the core service; in many jurisdictions, that does require separate consent (or at least an easy opt-out at first contact, per some email spam laws).
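
A hypothetical sketch of that distinction follows: transactional messages ride on the contractual-necessity basis, while marketing requires a separately recorded opt-in (or, under some spam laws, at least an opt-out). The categories and function below are illustrative assumptions, not a statement of any particular statute’s test.

```python
# Hypothetical message categories; real systems would define these per service.
TRANSACTIONAL = {"password_reset", "receipt", "account_notification"}
MARKETING = {"newsletter", "promotion"}


def lawful_to_send(message_type: str, marketing_opt_in: bool) -> bool:
    """Illustrative check of which basis covers an email to an existing customer."""
    if message_type in TRANSACTIONAL:
        return True                   # contractual necessity: part of delivering the service
    if message_type in MARKETING:
        return marketing_opt_in       # needs separate consent (or an opt-out regime, by jurisdiction)
    return False                      # unknown purpose: do not send without a clear legal basis


print(lawful_to_send("receipt", marketing_opt_in=False))     # True
print(lawful_to_send("newsletter", marketing_opt_in=False))  # False
```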

CHALLENGES IN OBTAINING MEANINGFUL CONSENT ONLINE

While laws have become quite detailed about how to obtain consent correctly, the practical reality of consent in the digital age is fraught with challenges. It’s worth examining these issues, as they are often the reason behind certain legal requirements and are a subject of ongoing policy debate:

Consent Fatigue: The average internet user is bombarded with consent requests. Every website has a cookie banner (“This site uses cookies, agree?”), every app installation asks for permissions (“This app wants access to your contacts, allow?”), plus lengthy privacy policies on sign-up forms. This overload can lead to users habitually clicking “Accept” without reading, just to remove the annoyance. 

Lengthy and Complex Policies: Although the law demands clear and understandable information, in practice many privacy policies remain long and full of legal jargon. This undermines informed consent. Efforts are underway to encourage simpler, layered notices (where you see a short summary and can click for more details) and the use of icons or standardized terms to convey key points.

Dark Patterns: A modern concern is the use of dark patterns – user interface designs that trick or coerce users into consenting. Examples include confusing opt-out mechanisms, misleading button sizes (e.g., a huge bright “I Accept” button and a tiny hard-to-see “Reject” link), or guilt-inducing language (“Support us by accepting cookies!”). Regulators are cracking down on these. 

Consent vs. Innovation: Some industry voices argue that requiring consent for every little data use can stifle innovation, especially in fields like AI and big data analytics where large datasets (often personal data) are needed to train models or discover insights. They contend that if every data point needed opt-in, beneficial technologies might not develop or would be limited. Lawmakers respond by carving exceptions (like for anonymized data, or for certain research uses under ethical oversight) and by noting that innovation shouldn’t come at the expense of fundamental rights. 

All these challenges suggest that while consent is a necessary tool for privacy, it is not a panacea. 

CONCLUSION

Consent in data sharing is at the heart of modern data protection laws – a fundamental mechanism by which individuals exercise control over their personal information in the digital age. Across jurisdictions, the legal requirements for consent converge on one principle: the individual’s autonomy and choice must be respected when their personal data is collected or shared. Whether under Europe’s GDPR, India’s new data protection statute, or various U.S. laws, obtaining valid consent means ensuring it is informed, voluntary, specific, and unambiguous. The law requires that consent not be a mere checkbox ticked out of habit or coercion, but a meaningful agreement.

However, as we have critically analyzed, achieving the ideal of informed consent is easier said than done in the digital era. Users are often overwhelmed with consent requests and lengthy policies, raising questions about how “informed” or “freely given” their consent truly is. Lawmakers and regulators are aware of these challenges and have been updating guidelines – mandating clearer notices, prohibiting dark patterns that mislead users, and imposing hefty penalties for those who flout consent requirements. The direction is clear: the digital age may have made data sharing ubiquitous, but the law is determined to put individuals back in control through stronger consent standards.

Crucially, while consent empowers users, it is not a silver bullet. We have seen that privacy frameworks also include exceptions (for legitimate interests, public needs, etc.) and additional rights (like the right to access data, delete data, or port data) alongside consent. This ensures that the absence of consent in certain scenarios doesn’t leave a protection vacuum – other safeguards step in. The interplay of consent with these mechanisms reflects a maturing of data protection law: a recognition that privacy requires a multi-layered approach. Consent remains the cornerstone, but it is reinforced by checks and balances.

From a critical perspective, the emphasis on consent in data sharing has its pros and cons. On one hand, it enshrines respect for individual choice, which is ethically and legally powerful – no one should trade away their privacy rights unknowingly or unwillingly. On the other hand, the consent model places a lot of responsibility on individuals to manage their privacy, which can be burdensome. As the digital landscape grows more complex (think IoT devices, AI analytics, cross-platform data flows), expecting users to grasp every consent instance is challenging. Thus, the future might see a combination of consent and accountability measures on data handlers (like making privacy the default, minimizing data collection, etc., so that even if a user consents, their data isn’t misused).

In conclusion, the law today requires that data consent be treated not as a checkbox, but as a process and promise: a process of informing and asking the user, and a promise that their wishes will be honored. The digital age has tested this concept, but also reinforced its importance. High-profile scandals (from Cambridge Analytica to big data breaches) have taught both the public and policymakers that when consent is ignored or engineered, trust breaks down. Therefore, robust consent requirements are here to stay, evolving hand in hand with technology.

Individuals should feel empowered by these laws – knowing that they have the right to say yes or no and to change their mind – and organizations must embed these consent requirements into their data practices. The trajectory is towards greater transparency and user empowerment: a digital ecosystem where data sharing happens only with consent, and that consent truly means consent. In the end, the goal of the law is to ensure that in the vast exchange of data that defines our digital age, the dignity and choice of the individual remain paramount.

References:

  • Justice K.S. Puttaswamy (Retd.) & Anr. v. Union of India & Ors., (2017) 10 SCC 1 (Supreme Court of India) – judgment recognizing privacy as a fundamental right.
  • General Data Protection Regulation (EU) 2016/679 (GDPR) – EU regulation defining consent and data protection requirements.
  • Planet49 GmbH, Case C-673/17 (2019), Court of Justice of the EU – cookie consent case holding that pre-ticked boxes are invalid for consent.
  • Digital Personal Data Protection Act, 2023 (India) – India’s data protection law detailing consent requirements and legitimate uses.
  • FTC v. Facebook, Inc. (2019 settlement) – US FTC action resulting in a $5 billion penalty for privacy violations related to consent and data misuse.
  • Children’s Online Privacy Protection Act (COPPA), 15 U.S.C. §§ 6501–6506 – US law requiring verifiable parental consent for data collection from children under 13.
  • Justice K.S. Puttaswamy & Anr. v. Union of India (Aadhaar case, 2018) (Supreme Court of India) – case discussing voluntary vs. mandatory consent in the context of biometric ID.
  • Information Commissioner’s Office (UK), Guidance on Consent – explains “freely given”, “specific”, “informed” and how to obtain and manage consent under the UK GDPR.