Tanya Singh & Kushagra Goyal

Targeted Health Advertisements: A Double-Edged Sword

Introduction

The global digital health market is projected to reach USD 509.2 billion by 2025[i]. The steep growth of this sector can be attributed to the need for affordable healthcare alternatives, technological innovations in medical devices (both wearable and surgical), the advent of artificial intelligence in healthcare, better internet infrastructure across the world, and the aggregation of information through big data. To tap into the financial potential of the digital health market, key stakeholders such as doctors and physicians, hospitals, medical corporations and pharmaceutical companies are increasingly relying on targeted health advertisements today.[ii] However, with increasing corporate greed among advertising companies and tech giants, user data is increasingly unsafe in their hands, raising privacy concerns for users.


In this blog post, we examine how targeted health advertisements function and the entities involved. We gauge the role of big data in targeted health advertising and the legal issues surrounding it, and provide a brief overview of the legal positions adopted by the EU and the USA with regard to targeted advertising and user privacy. We also consider how India could deal with the future challenges arising from targeted health ads under its existing laws and jurisprudence on privacy.


What are Targeted Advertisements?

Targeted advertisements are a form of online advertising meant to reach a specific audience on the basis of their browsing history, purchase history, online behavior, geolocation, preferences, etc. These traits are collected from various websites and social media platforms as “user data”, typically stored in cookies.[iii] The cookies are shared with ad networks, which then serve us personalized advertisements.


For instance, if an individual is looking for a diet plan for weight loss, it is likely that s/he will come across multiple advertisements for health apps that provide a diet-planning service, pop-up ads for weight-loss methods, or follow requests from health and wellness pages on social media accounts like Instagram, among other forms of targeted health advertising. Advertisers thus target consumers by tracking their internet activity and showing them products and services that they think might interest the consumers.
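To make the mechanism concrete, the sketch below shows, in highly simplified form, how an ad network might map cookie-linked browsing events to interest segments. It is illustrative only: the keyword rules, segment names and function are hypothetical, and real ad platforms rely on far richer signals and machine-learned models rather than keyword matching.

```python
# Illustrative sketch only: a toy model of mapping cookie-linked browsing
# events to interest segments. All keyword rules and segment names are
# hypothetical.
from collections import Counter

# Hypothetical keyword-to-segment rules an ad network might maintain.
SEGMENT_RULES = {
    "diet plan": "health_diet",
    "weight loss": "health_diet",
    "sleep apnea": "health_sleep",
}

def infer_segments(browsing_events):
    """Return the interest segments suggested by the pages a cookie has
    been seen on, most frequent first."""
    counts = Counter()
    for event in browsing_events:
        for keyword, segment in SEGMENT_RULES.items():
            if keyword in event.lower():
                counts[segment] += 1
    return [segment for segment, _ in counts.most_common()]

# The cookie ID ties these page titles/searches to one pseudonymous browser.
events = ["Best diet plan for beginners",
          "Weight loss in 30 days",
          "CPAP masks for sleep apnea"]
print(infer_segments(events))  # ['health_diet', 'health_sleep']
```

An ad network holding such segments can then sell impressions to advertisers bidding on, say, a “health_diet” audience, which is how the diet-plan searcher above ends up seeing weight-loss ads across unrelated websites.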


It is also important to understand that advertisers and marketers rely on “Big Data” to achieve effective targeted advertising. Big data refers to giant volumes of information collected from websites, online searches, download patterns, online transactions, app and ad preferences, age, education, etc. This information is collected by data brokers and processed using big data analytics to extract meaningful information relevant to the purposes of advertisers and marketing companies.[iv]


Targeted Health Advertisements

The healthcare industry is contributing to big data at a tremendous rate with the digitization of clinical results, lab data, radiology reports, patient demographics, patient medical histories and more.[v] Electronic health records (EHRs) are increasingly being adopted by countries across the world, and EHRs are designed to be shared seamlessly across health professionals and institutions to improve medical research and patient outcomes.


In India, with the exception of Kerala’s e-Health programme,[vi] there is no standardized EHR system. With its mixed healthcare system of central government, state government and private healthcare providers, India has no standardized data storage system or data interoperability. There are also privacy and data security challenges concerning EHRs, and the only existing legislative provision on the protection of health data is section 43A of the IT Act.[vii] Section 43A is a limited provision in that it covers only incorporated entities, leaving many health establishments outside its scope.[viii] The result is that the health data of Indian consumers remains unsafe and prone to breach by data vendors, advertising companies and the like.


Even IoT medical devices such as wearables, biosensors and clinical devices for monitoring vital signs generate large volumes of health data, which is being collected by tech giants like Facebook, Google and Amazon.[ix] A multi-billion-dollar market thus exists for medical data, and marketing companies are able to buy it in anonymised form for advertising purposes.


Issues with Targeted Health Advertisements

Targeted health ads are beneficial in that they introduce health seekers to a plethora of relevant healthcare providers, medical devices and treatment options and apprise them of current trends in the health market. They reduce the time and effort one would normally need to look for similar services oneself. However, targeted ads are a double-edged sword and come with their own set of challenges.


To illustrate, a man searched for a device to help with his sleep apnea and started noticing ads for similar devices when he visited other websites. An investigation into the case by the Office of the Privacy Commissioner of Canada revealed that the ads were delivered by Google’s AdSense service, and it was found that Google had violated Canada’s privacy laws by targeting ads on the basis of sensitive personal data such as a person’s health.[x]


In another investigation into data brokers and their practices, the World Privacy Forum found that data brokers were selling lists of AIDS patients, rape victims, alcohol addicts, people suffering from dementia and others.[xi] Given the sensitive and private nature of such medical data, most jurisdictions guarantee its protection under their privacy or health laws. The fact that data brokers still manage to gain access to sensitive personal information and trade in it without the informed consent or knowledge of the consumers and patients concerned is worrisome. Lists of rape and domestic abuse victims could severely impact victims’ lives if they fell into the hands of predators. Further, selling lists of senior citizens suffering from dementia could easily expose them to scam artists. AIDS patients, persons belonging to the LGBTQ community and dyslexic individuals also stand to face discrimination. Even anonymised medical data is still associated with age, gender, a partial zip code and a doctor’s name, and the re-identification process is only getting easier with sophisticated data mining tools.[xii]
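To illustrate why such quasi-identifiers undermine anonymisation, the sketch below (using entirely invented records) shows how an “anonymised” medical dataset can be linked back to named individuals by joining it against an auxiliary list the attacker already holds. Real re-identification attacks work on this same principle, only at much larger scale and with richer auxiliary data.

```python
# Illustrative sketch only: re-identifying "anonymised" records that retain
# quasi-identifiers (age, gender, partial ZIP code, treating doctor).
# Every record below is invented for illustration.

anonymised_medical = [
    {"age": 42, "gender": "F", "zip3": "940", "doctor": "Dr. A", "diagnosis": "dementia"},
    {"age": 29, "gender": "M", "zip3": "941", "doctor": "Dr. B", "diagnosis": "HIV"},
]

# An auxiliary dataset the attacker already has (e.g. a purchased marketing list).
auxiliary = [
    {"name": "Jane Doe", "age": 42, "gender": "F", "zip3": "940", "doctor": "Dr. A"},
]

QUASI_IDENTIFIERS = ("age", "gender", "zip3", "doctor")

def reidentify(medical, aux):
    """Link 'anonymised' medical records back to named individuals
    whenever the quasi-identifiers match uniquely."""
    matches = []
    for record in medical:
        candidates = [p for p in aux
                      if all(p[k] == record[k] for k in QUASI_IDENTIFIERS)]
        if len(candidates) == 1:  # a unique match means the record is re-identified
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(reidentify(anonymised_medical, auxiliary))  # [('Jane Doe', 'dementia')]
```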


Additionally, there has been a rise in advertisements for unscrupulous medical products and scientifically unproven medical treatments, which put the targeted consumers at great risk of dangerous health outcomes. To curb the rise in such ads, Google in 2019 introduced a new policy prohibiting ads that have no established biomedical or scientific basis,[xiii] although its implementation remains to be seen.


Legal Position of Targeted Health Advertising: USA, EU & India

The right to freedom of speech and expression is not absolute.[xiv] The State has a duty to strike a fair balance between the right to reputation and the right to freedom of expression, which places a duty on the State to safeguard the digital privacy[xv] of users.[xvi] States have recognized the need for the flow of information in a globalized world while maintaining individual privacy,[xvii] and have embraced the principle that the Internet plays a fundamental role in extending the freedom of expression to its users. It is in light of this recognition that internet intermediaries, i.e. internet service providers and content-hosting websites, enjoy immunity under the laws of various jurisdictions. They act as a bridge connecting advertisers to their customers.


USA Model of Monetary Loss of Privacy and Limited Immunity

There exists a direct correlation between the efficiency of advertising and the ability to identify users perfectly.[xviii] The federal courts have accepted that the acquisition of a user’s browsing information in the form of cookies is a valid means of targeted advertising.[xix] However, it is also acknowledged that firms, in their quest for efficient targeted advertising, can over-invest in information collection.[xx] The concern for privacy arises at this juncture, where firms collect user data to which the user has not consented.


Two USA statutes, namely the Electronic Communications Privacy Act of 1986 (ECPA)[xxi] and the Health Insurance Portability and Accountability Act (HIPAA),[xxii] are relevant for elucidating the USA jurisprudence.


For a privacy violation under the ECPA, first, private information must intentionally be disclosed or transferred without the consent of the user. Second, there must be an agreement between the violators to collect and disburse the information to the entity that has paid to receive it. Lastly, the information must be commercially monetized. Under this regime, the onus is on the user to detect and report that their information has been accessed without their consent and to establish that the information accessed is capable of being monetized.[xxiii]


HIPAA regulates the health care data of patients and the methods of its transmission across jurisdictions. Liability arises only when there is a breach of Individually Identifiable Health Information. HIPAA does not cover within its ambit data collection through methods such as geolocation or browsing history that are used to recommend health ads to a user. Protection under the legislation arises only when, for example, a user uses a health product or service on the basis of a targeted ad and the health company in turn misuses their data, leading to a data breach.


The USA model thus places a double-layered burden on the user before the protection of the law comes to their aid. First, the onus to report and prove the breach lies on the user. Second, the information leaked must be capable of being commercially monetized. Therefore, only commercially viable leaked private information is protected in the USA.


The GDPR and the E.U. Model of Legislative Mandated Self-Regulation

The Court of Justice of the European Union has held that user information processed by data companies is protected under the fundamental rights to privacy and to the protection of personal data.[xxiv]


The EU recently adopted an exhaustive piece of legislation on the processing of personal data and its free movement, namely the General Data Protection Regulation (GDPR).[xxv] The Regulation defines personal data as “any information relating to an identified or identifiable natural person”.


The GDPR states that data processing should be designed to serve mankind and that the right to the protection of personal data is therefore not absolute. Cross-border flows of data are central to today’s economy, and the onus is on the state to proportionately balance these two contesting interests.


The GDPR applies to all sectors of the economy, including the health care industry. The Regulation recognizes many user-friendly rights, such as the right to be forgotten, the subject access right, the right not to be subject to profiling, and the right not to have one’s data used beyond its original purpose. It recommends encryption and pseudonymization for data transmission to protect user data. The GDPR empowers users to become aware of their data and its monetization by data companies for targeted advertising. Owing to the universal nature of the law and its limited scope for exclusions, new methods of data collection are also governed by it. The approach is more democratic in that the system is clearly delineated and does not give unreasonable powers to the administration: the rights enshrined are not just for users but also for data companies. The Regulation places the burden on data companies to comply with the law themselves or face penalties, up to and including expulsion from the market.
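As an illustration of pseudonymization, the sketch below replaces a direct patient identifier with a keyed hash before a record is shared. This is only one common technique, offered as an example under stated assumptions: the GDPR recommends pseudonymization but does not prescribe any particular method, and the identifiers, key and record fields here are hypothetical.

```python
# Illustrative sketch only: pseudonymising a direct identifier with a keyed
# hash (HMAC) before sharing a record. The key stays with the data controller;
# without it, the pseudonym cannot be reversed to the original identifier.
import hmac
import hashlib

SECRET_KEY = b"keep-this-key-out-of-the-shared-dataset"  # held only by the controller

def pseudonymise(patient_id: str) -> str:
    """Replace a direct identifier with a stable pseudonym."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "IN-2020-000123", "diagnosis": "type 2 diabetes"}
shared_record = {**record, "patient_id": pseudonymise(record["patient_id"])}
print(shared_record)
```

Note that pseudonymization protects only the direct identifier; as the re-identification example earlier shows, the remaining quasi-identifiers in a record still need attention.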


Indian Policy Conundrum

The judgment of the Supreme Court in Justice K. S. Puttaswamy (Retd.) & Another v. Union of India & Ors. recognized the right to privacy as a fundamental right under the Indian Constitution.[xxvi] In legitimizing the concept of informational privacy, the Court engaged with the model of targeted advertising, in which user data is collected as a means of gaining knowledge. The judgment called for a balancing act between the right to privacy and the collection of data, and recognized the non-consensual sharing of user data as a violation of the right to privacy. Indian jurisprudence has recognized self-regulation by companies,[xxvii] above the mechanism of notice and take-down, as the most viable method of protecting user information online.[xxviii] The Court also rejected the restricted access principle, under which an individual loses all control over private data once it has been shared.[xxix] The draft data protection law recognizes health data as sensitive and calls for higher protection for it.[xxx] It also recognizes the sharing of information as a virtue that benefits the larger community, as long as the trust reposed in the data company is not breached.


Conclusion

In today’s data economy, even the most inconsequential data holds value. Our data has the capability of making our lives easier, but it also leaves us vulnerable to big corporations and tech giants looking to profit from our information. Governments thus have an added responsibility to protect their citizens’ privacy, reputation and safety online. Data companies must also be more ethical in collecting our data for advertising and commercial purposes. Monitoring and tracking an individual for the purposes of targeted health advertisements should not take the shape of digital surveillance. Most importantly, consumers must be wary of the (health) advertisements they see and make informed choices for themselves.


It remains to be seen how data privacy laws ultimately take shape in India. Along the lines of the GDPR, the Personal Data Protection Bill, 2019 needs to treat privacy on the principles of social contract theory. Under this model, all stakeholders, i.e. users, data and health companies, and the state, collectively share the burden of the safe transmission of data. Such a structure builds a responsible self-regulatory apparatus while leaving enough room for innovation and the growth of businesses. This model is distinct from the control-regime model of privacy regulation, in which all power vests in an authority that regulates and adjudicates the disputes before it. The control-regime model increases the risk of over-centralization of data, favoritism and abuse of power, resulting in limited consumer choice and competition between market players.


Targeted health advertisements are a means to a more connected health infrastructure, and strong privacy laws can keep this medium free of abrasive forces. Ultimately, data sharing should enhance an individual’s autonomy and help build a stronger internet user community rather than leaving users prey to profiteers.

 

[i] ‘Digital Health Market Size Worth $509.2 Billion By 2025 | CAGR: 27.7%’ (Grand View Research, May 2019) <https://www.grandviewresearch.com/press-release/global-digital-health-market> accessed 18 May 2020.

[ii] Jennifer Bresnick, ‘Should cash-strapped small practices turn to ad-supported EHRs?’ (EHR Intelligence, 22 January 2013) <https://ehrintelligence.com/news/should-cash-strapped-small-practices-turn-to-ad-supported-ehrs> accessed 18 May 2020.

[iii] Darla Cameron, ‘How Targeted Advertising Works’ (The Washington Post, 22 August 2013) <https://www.washingtonpost.com/apps/g/page/business/how-targeted-advertising-works/412/> accessed 18 May 2020.

[iv] Annika Richterich, ‘Big Data-Driven Health Surveillance’ in The Big Data Agenda: Data Ethics and Critical Data Studies (London University of Westminster Press 2018).

[v] Sabyasachi Dash, Sushil K Shakyawar, Mohit Sharma and Sandeep Kaushik, ‘Big data in healthcare: management, analysis and future prospects’ (2019) 6 Journal of Big Data 54.

[vi] Shahid Akhter, ‘Kerala’s digital transformation will empower and impact healthcare delivery in a big way: Rajeev Sadanandan’ (The Economic Times, 9 February 2017) <https://health.economictimes.indiatimes.com/news/industry/keralas-digital-transformation-will-empower-and-impact-healthcare-delivery-in-a-big-way-rajeev-sadanandan/57052248> accessed 18 May 2020.

[vii] Information Technology Act 2000, s 43A.

[viii] Health-Tech Ikigai Law, ‘DISHA and the draft Personal Data Protection Bill, 2018: Looking at the future of governance of health data in India’ (Ikigai Law, 25 February 2019) <https://www.ikigailaw.com/disha-and-the-draft-personal-data-protection-bill-2018-looking-at-the-future-of-governance-of-health-data-in-india/#acceptLicense> accessed 18 May 2020.

[ix] Lucas Mearian, ‘Yes, Google’s using your healthcare data – and it’s not alone’ (ComputerWorld, 15 November 2019) <https://www.computerworld.com/article/3453818/yes-googles-using-your-healthcare-data-and-its-not-alone.html> accessed 18 May 2020.

[x] Susan Krashinsky, ‘Google broke Canada’s privacy laws with targeted health ads, watchdog says’ (The Globe and Mail, 15 January 2014) <https://www.theglobeandmail.com/technology/tech-news/google-broke-canadas-privacy-laws-with-targeted-ads-regulator-says/article16343346> accessed 18 May 2020.

[xi] Neil Raden, ‘Data brokers and the implications of data sharing – the good, bad and ugly’ (Diginomica, 19 July 2019) <https://diginomica.com/data-brokers-and-implications-data-sharing-good-bad-and-ugly> accessed 18 May 2020.

[xii] Patientory Inc., ‘Data Brokers Have Access to Your Health Information, Do You?’ (Medium, 16 November 2018) <https://medium.com/@patientory/data-brokers-have-access-to-your-health-information-do-you-562b0584e17e> accessed 18 May 2020.

[xiii] ‘Google to ban ads for unproven medical techniques’ (The Economic Times, 7 September 2019) <https://brandequity.economictimes.indiatimes.com/news/advertising/google-to-ban-ads-for-unproven-medical-techniques/71023621> accessed 18 May 2020.

[xiv] ICCPR (adopted 16 December 1966, entered into force 23 March 1976) 999 UNTS 171, art 19(2); Rios et al v. Venezuela, IACtHR (2009) Series C No 194, [346]; HRC, ‘General Comment 34’ (12 September 2011) UN Doc CCPR/C/GC/34.

[xv] Einarsson v. Iceland App no 24703/15 (ECtHR, 7 November 2015).

[xvi] Brandon T Crowther, ‘(Un)Reasonable Expectation of Digital Privacy’ (2012) Brigham Young University Law Review 343.

[xvii] Fair Information Practice Principles Generally, Federal Trade Commission (USA) <https://web.archive.org/web/20090331134113/http://www.ftc.gov/reports/privacy3/fairinfo.shtm>.

[xviii] Benjamin E. Hermalin and Michael L. Katz, ‘Privacy, Property Rights and Efficiency: The Economics of Privacy as Secrecy’ (2006) 4 Quantitative Marketing and Economics 209.

[xix] In Re Google Inc. Cookie Placement Consumer Privacy Litigation 988 F.Supp.2d 434 (2013); Force v. Facebook Inc. 934 F.3d 53 (2019).

[xx] Jack Hirshleifer, ‘The Private and Social Value of Information and the Reward to Inventive Activity’ (1971) 61 American Economic Review 561.

[xxi] 18 U.S.C. §§ 2510-2523.

[xxii] Health Insurance Portability and Accountability Act of 1996, Pub.L. 104–191.

[xxiii] In Re Google Inc. Cookie Placement Consumer Privacy Litigation 988 F.Supp.2d 434 (2013).

[xxiv] Google Spain v. AEPD and Mario Costeja González C-131/12 (CJEU, 13 May 2014).

[xxv] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (‘General Data Protection Regulation’) [2016] OJ L 119/1.

[xxvi] Justice K.S. Puttaswamy (Retd.) v. Union of India And Ors (2018) 1 SCC 809.

[xxvii] Sabu Mathew George v. UOI (2017) 7 SCC 657.

[xxviii] Shreya Singhal v. UOI (2015) 5 SCC 1.

[xxix] Justice K.S.Puttaswamy (Retd.) v. Union of India And Ors (2019) 1 SCC 1.


Authored by Tanya Singh and Kushagra Goyal, legal practitioners and graduates of National Law University Odisha. This blog is part of the RSRR Blog Series on Digital Healthcare in India, in collaboration with Nishith Desai Associates.
