Nov 01, 2024 11:26 AM IST
The framework of consent has been designed to protect the individual’s privacy, but we must confront difficult scenarios where data can be owned by multiple parties, including social groups and platforms
The use of technology holds the promise of making health care accessible and cost-effective. Both public and private entities in India have committed substantial resources to this sector. The Ayushman Bharat Digital Mission (ABDM), for example, follows a technology-driven approach, with a core emphasis on ensuring fairness and equal access to health care for all. By collecting, processing, and sharing health data, the mission aims to empower patients to make informed health care decisions in real time. At present, the health sector in India grapples with inadequate infrastructure, a shortage of health care professionals, and heavy out-of-pocket expenditure. Digitisation can be instrumental in addressing these challenges, but it carries a risk of privacy violations, the disproportionate impact of which is likely to be borne by vulnerable groups.
The digital health system relies on data exchange between patients and service providers, raising privacy concerns at multiple levels. The recently enacted Digital Personal Data Protection (DPDP) Act makes consent central to the collection and processing of personal data. However, the lack of transparency in consent mechanisms leaves users unaware of how their data will be used and gives them no power to negotiate with platforms. For example, if apps begin nudging users with misleading ads, built from data assembled across different touchpoints, urging them to plan a pregnancy in a certain way, the line between medical advice and advertising blurs, all under the guise of personalisation. Exploiting expectant mothers’ emotions for profit can harm their physical and mental well-being, and that of their unborn children.
Further, the framework of consent is designed to protect the individual’s privacy, but we must confront difficult scenarios where data is effectively owned by multiple parties, including social groups and platforms. For example, when consenting to share health information with platforms, individuals may disclose genetic data about relatives who have not given consent. Combining genetic data with other datasets, such as sleep patterns, step counts, and stress levels, can predict possible health conditions such as diabetes and heart disease. Insurers and wellness apps may pool these datasets to profile patients and assess risk, and then offer discounts on insurance premiums based on that assessment. This penalises vulnerable individuals by holding them responsible for their health outcomes, and it compounds the challenges faced by those who lack the required technology, literacy, or healthy lifestyle.
The DPDP Act also introduces the concept of deemed consent for “any fair and reasonable purpose”. If a patient provides medical information in a clinical setting, where it is reasonable to do so because of established trust, this is treated as consent to process the data as deemed necessary. However, this may lead to unwanted disclosures, potentially forcing individuals with stigmatised health conditions to choose privacy over health care. For example, a patient might share their HIV report with a doctor, expecting confidentiality. If Health IDs become accessible to all family members and insurance companies, as permitted under ABDM, patients would lose control over who can access their data and when. Additionally, data is shared at multiple levels in the ABDM system, making it difficult to assess risks and fix accountability.
Deemed consent also allows employers to use the personal data of their employees without explicitly seeking it. The growing use of wellness apps, such as fitness trackers and pregnancy apps, adds complexity here. These apps, increasingly deployed by employers, collect data that includes sleep patterns, intimate details such as sexual activity, and emotional states. This information can be accessible to employers, insurers, and the third-party administrators who manage medical claims. If it is used to assess medical claims or job performance, employees who fail to manage their app settings may grant consent, or be subjected to deemed consent, and risk losing health coverage or even their jobs. This has the potential to reshape the relationship between employees and employers, disproportionately affecting women.
To maximise the potential benefits of technological innovation, we must improve access to data while preventing its misuse. Given the challenges of digital literacy and consent in data sharing, we need an independent risk assessment group to regularly audit health care providers’ data practices. It should establish accountability for what data is collected, for what purpose, how it is shared, and with what safeguards, and it should prescribe privacy labels and frameworks to address unintentional harms. Making these indicators public would enable consumers to choose services that prioritise privacy. Further, incorporating privacy by design, through techniques such as multi-party computation and federated learning, and penalising re-identification attempts can protect consumers.
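To make the privacy-by-design idea concrete, the sketch below illustrates federated learning in Python: each hospital trains a simple model on its own records and shares only model parameters with a coordinator, so raw patient data never leaves the institution. The hospitals and data here are entirely hypothetical, and this is a minimal illustration of the technique under simplified assumptions, not a description of ABDM or any deployed system.

import numpy as np

# Minimal sketch of federated averaging (FedAvg): each hospital fits a
# logistic-regression step on its own records and shares only weights;
# raw patient data never leaves the hospital. All data is synthetic.

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One hospital's local training: gradient steps on its own data."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))    # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)       # logistic-loss gradient
        w -= lr * grad
    return w

# Three synthetic "hospitals", each holding its own (features, labels).
true_w = np.array([1.5, -2.0, 0.5])
hospitals = []
for _ in range(3):
    X = rng.normal(size=(200, 3))
    y = (1.0 / (1.0 + np.exp(-X @ true_w)) > rng.random(200)).astype(float)
    hospitals.append((X, y))

global_w = np.zeros(3)
for _ in range(10):                              # communication rounds
    # Each site trains locally; only weight vectors are shared.
    local_ws = [local_update(global_w, X, y) for X, y in hospitals]
    # The coordinator averages weights, weighted by local dataset size.
    sizes = np.array([len(y) for _, y in hospitals], dtype=float)
    global_w = np.average(local_ws, axis=0, weights=sizes)

print("learned weights:", np.round(global_w, 2))  # approaches true_w

The key property is that only aggregated parameters cross institutional boundaries; layering secure aggregation or multi-party computation on top can hide even individual sites’ updates from the coordinator, while penalties for re-identification attempts address residual risks.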
Asheef Iqubbal is a technology policy researcher at CUTS International. The views expressed are personal