India’s ambitious attempt to protect children online through its draft Digital Personal Data Protection Rules rests on a shaky foundation: it depends on children voluntarily declaring that they are underage and assumes that parents are more digitally literate than their children, triggering a complex verification system that experts say is inherently flawed yet might be the best available option.
The rules, released as a draft on Friday for public consultation until February 18, represent India’s most comprehensive attempt yet to protect the digital privacy of its citizens, and set in place specific protections for children.
They mandate parental consent for users under 18, require platforms to verify guardians’ identities through undefined mechanisms, and prohibit behavioural tracking of minors. But experts say the effectiveness of these stringent measures hinges on trusting children to voluntarily identify themselves as minors, highlighting fundamental challenges in protecting young users online while preserving accessibility and privacy.
“The fundamental question is – what happens if I don’t give my correct age? How will the entire mechanism work?” asked technology lawyer Gowree Gokhale, highlighting the core challenge facing India’s new data protection framework.
Under Section 9 of the Digital Personal Data Protection Act, data fiduciaries must obtain “verifiable consent” from parents or lawful guardians before processing the data of anyone under 18. The section also prohibits processing data that could harm children, targeting them with ads, or tracking their behaviour. For non-compliance with these obligations, the Data Protection Board can impose penalties of up to ₹200 crore. While the framework is comprehensive, its implementation begins with a crucial weakness: self-declaration of age.
“This typically may not work as who will say no to it, thereby failing the government’s stated objective of providing a safe online environment for children,” noted Aprajita Rana, partner at AZB & Partners. “The only scenario where this works is if the parent is either sitting with the child as the account is created, or creates the account on their behalf.”
The other significant problem with this regime is that it assumes that parents are always more digitally literate than their children and can give informed consent on behalf of their children. “Our research has shown that digital literacy imbalance exists within families where children show higher digital literacy than parents and operate basic digital services like payment services,” Siddharth P, co-founder of Rati Foundation, an NGO that runs a helpline for online safety, said.
The draft rules represent one of the world’s most stringent approaches to protecting children’s data, requiring parental consent for all users under 18. The United States’ Children’s Online Privacy Protection Rule (COPPA) defines a child as someone under the age of 13, while Europe’s General Data Protection Regulation (GDPR) allows member states to set the age of digital consent between 13 and 16.
Even major platforms have acknowledged the inherent challenges in age verification and the trade-offs made between effective age verification and the principle of data minimisation. Meta’s global head of Safety, Antigone Davis, speaking to HT in November, had described it as “a challenge for the entire industry” where “no age mechanisms work perfectly.” Instagram’s current approach combines multiple methods – from basic age declarations to photo IDs and video selfies – and even analyses public posts for age-related signals, such as birthday wishes that might contradict stated age. However, Meta had sought exemptions to use behavioural data for age assessment of children, something the proposed rules only permit for verifying adulthood.
Siddharth pointed out that age gating may also lead to tradeoffs between digital access and privacy on the one hand, and security on the other. “Digital divide is prevalent in India, especially along gender lines. Our research shows that girls get much less access to digital services and devices. Age gating codifies permission-seeking behaviour that is a social expectation in girls. So when such mechanisms come into place, there is surveillance within the family as well. This could lead to exclusion of girls from digital spaces,” he said.
Technical challenge
Age gating, the process of restricting access to services based on age, can range from simple pop-ups requiring users to confirm their age to complex age verification systems involving government IDs or facial scanning and biometrics.
For verifiable parental consent, platforms must accomplish four tasks: determine if the user is a child, verify an adult’s identity, establish the relationship between them, and retain a record of the adult’s consent.
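The four tasks above can be sketched as a simple workflow. This is a hypothetical illustration only, not any platform’s actual implementation; all function and field names are assumptions, and the identity and relationship checks stand in for whatever mechanism a platform adopts (government IDs, DLSP tokens, and so on).

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    """Retained record of a guardian's consent (task four)."""
    child_user_id: str
    guardian_id: str
    granted_at: str


def obtain_verifiable_consent(user, guardian, verify_identity, verify_relationship):
    """Hypothetical four-step flow for verifiable parental consent.

    `verify_identity` and `verify_relationship` are placeholders for
    whatever verification mechanism a platform chooses.
    """
    # Task 1: determine if the user is a child. Note this relies on the
    # self-declared age -- the weakness the article identifies.
    if user["declared_age"] >= 18:
        return None  # no parental consent required

    # Task 2: verify the adult's identity.
    if not verify_identity(guardian):
        raise PermissionError("guardian identity could not be verified")

    # Task 3: establish the guardian-child relationship.
    if not verify_relationship(guardian, user):
        raise PermissionError("guardian-child relationship not established")

    # Task 4: retain a record of the adult's consent.
    return ConsentRecord(
        child_user_id=user["id"],
        guardian_id=guardian["id"],
        granted_at=datetime.now(timezone.utc).isoformat(),
    )
```

Even in this skeletal form, each step exposes a question the draft rules leave open: step one trusts the declared age, and steps two and three assume a verification mechanism the rules do not define.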
The verification process creates multiple paths and challenges.
For major platforms like Facebook, existing adult users’ data could verify parental identity. But smaller platforms face a heavier burden, needing to either collect government IDs or integrate with third-party digital locker service providers (DLSPs) as proposed in the DPDP Rules.
“Big Tech does have an advantage here. It might be harder for smaller data fiduciaries to adopt these measures and access DLSPs as they will need more robust security measures, leading to lags in the short term, though overall this is good for user privacy,” Rana noted. The disparity raises concerns about competition and implementation feasibility.
The government has acknowledged these challenges. In a July meeting with social media companies, senior ministry of electronics and information technology officials emphasised the need to balance child protection with technical feasibility. An official aware of the matter told HT on Saturday that the government is hoping for solutions through the consultation process.
Technical limitations compound these challenges. A second government official, familiar with the working of DigiLocker, the Indian government’s DLSP, said on Saturday that “there is no solution yet for age verification that does not simultaneously lead to identification of the individual.”
Even basic age verification through DigiLocker’s API requires including names, as multiple family members might share devices.
The rules provide exemptions for certain sectors, including healthcare providers, educational institutions, and childcare centres. However, these exemptions have raised concerns about their scope and implications.
“Section 9(3) ideally needs to be split into two so that DFs [data fiduciaries] are not needlessly exempted from restrictions on targeted ads even as they are allowed to monitor behaviour online to assess the age of users,” multiple experts told HT. “If a 16-year-old girl goes to a gynaecologist, she may need parental consent but how does that allow the healthcare institution to target her with ads?” asked Rana.
Some experts advocated a more nuanced, graded approach. “For non-intrusive websites that are meant for information, why should verifiable parental consent be required?” asked Nikhil Narendran, partner at Trilegal.
Aparajita Bharti, founding partner of The Quantum Hub Consulting, suggested that news websites, while acting as data fiduciaries, should be exempt from parental consent requirements even as they remain unable to monitor children’s behaviour or target them with ads.
The framework also creates unexpected data implications. Linking parent and child accounts generates new datasets about verified parents’ online behaviour. “There is no restriction on how this data associated with the parents’ accounts can be monetised by the DFs,” Rana pointed out.
Gokhale added that neither the law nor the proposed rules address problems associated with data sharing, enrichment, and cross-analysis using varied datasets.
Purpose-based exemptions in the rules have sparked additional debates. Two exemptions – exercising power in a child’s interest under Indian law and blocking detrimental information – are “extremely broad,” according to Rana.
“Who determines what is detrimental for a child? Is factual news about a violent incident such as a rape or a riot detrimental to the well-being of a teenager? Should children not have access to political content, conspiracy theories or misinformation? The government needs to define the classes of harm they are trying to protect children from so that, in the attempt to make a safer internet, access to educational and historical information is not curbed.”
Even seemingly straightforward exemptions raise complex questions. The rules exempt email account creation from parental consent requirements, but as Gokhale pointed out: “Today, when you sign up for an email account, you may be simultaneously signing up for allied services. How do you delink those?” For instance, signing up for Gmail signs users up for Google Maps, YouTube, and other Google services.
Rana said that companies such as Google and Microsoft may have to consider delinked accounts that only offer email services for a user who, while signing up, declares that they are a minor.
Bharti, however, said that this exemption exists to allow schools to create email addresses for students on the school’s domain. But she asked whether the exemption would allow Google to serve ads in the Gmail account assigned by the school to a minor.
The rules’ reliance on DLSPs could also face practical hurdles. Despite the Digital Locker Authority’s existence since 2016, no digital lockers are currently listed in its directory. The limitations are visible in real-world applications: “This is why the onus for seeking parental consent for APAAR ID has been transferred to schools, and schools are taking this consent manually, through paper consent forms. No virtual tokens are being generated to access records in DigiLocker for creating APAAR ID,” the official aware of DigiLocker’s functioning and quoted above said.
The ambiguity in the rules means that experts differ on whether or not age verification is required for everybody on the internet.
For Gokhale, effective verifiable consent means that age verification needs to be implemented for everyone. “Anonymity then is a thing of the past,” she said.
Rana, however, said that despite the challenges, imperfect solutions might be necessary.
“While it is true that anonymity for the child in question – by virtue of linking their account to an adult account – and that of the linked parent or guardian is gone because they have to actually prove their age and identity and it is an imperfect mechanism from the get go, it is perhaps the only mechanism that doesn’t erode privacy for everybody using digital services. The alternative where everyone has to verify their identity in some formal way before using the internet is riskier,” Rana explained.