The EU’s endless regulation of data usage has spilled over into academia, providing another lesson in kneecapping your own society by overregulating it. And they wonder why none of the big internet companies arose from the EU (or ever will). This time, the European data regulators seem to be doing everything they can to hamstring clinical trials and drive the research (and the resulting tens of billions of dollars of annual spend) outside the EU. That’s bad for pharma and biotech companies, but it’s also bad for universities that want to attract, retain, and teach top-notch talent.
The European Data Protection Board’s Opinion 3/2019 (the “Opinion”) fires an early and self-wounding shot in the coming war over the meaning and application of “informed consent” under the GDPR. The EU Board insists on defining “informed consent” in a manner that would cripple most serious health research on humans and human tissue that might otherwise take place in European hospitals and universities.
As discussed in a US law review article by former Microsoft Chief Privacy Counsel Mike Hintze, Science and Privacy: Data Protection Laws and Their Impact on Research, 14 Washington Journal of Law, Technology & Arts 103 (2019), and noted in a recent IAPP story from Hintze and Gary LaFever, both the strict interpretation of “informed consent” and the GDPR’s right to withdraw consent can cripple serious clinical trials. Further, according to LaFever and Hintze, researchers have raised concerns that “requirements to obtain consent for accessing data for research purposes can lead to inadequate sample sizes, delays and other costs that can interfere with efforts to produce timely and useful research results.”
A clinical researcher must have a “legal basis” to use personal information, especially health information, in trials. One of the primary legal-basis options is simply obtaining the test subject’s permission to use the data. Only this is not so simple.
On its face, the GDPR requires that consent to use personal data (including health data) be given by a clear affirmative act and be “freely given, specific, informed and unambiguous.” The Opinion clarifies that nearly all operations of a clinical trial – start to finish – are regulated processing operations involving personal information, and that special “explicit consent” is required for use of health data. Explicit consent requirements can be satisfied by a written statement signed by the data subject.
That consent would need to include, among other things:
- the purpose of each of the processing operations for which consent is sought,
- what (type of) data will be collected and used, and
- the existence of the right to withdraw consent.
The Opinion is clear that the EU Board authors believe the nature of clinical trials to be one of an imbalance of power between the data subject and the sponsor of the trial, so that consent for use of personal data would likely be coercive and not “freely given.” This raises the specter that not only can the data subject pull out of a trial at any time (or insist his or her data be removed upon completion of the trial), but EU privacy regulators are likely simply to cancel the right to use personal health data on the theory that the signatures could not have been freely given where the trial sponsor held an imbalance of power over the data subject. Imagine spending years and tens of millions of euros conducting clinical trials, only to have the results rendered meaningless because, once the consents are invalidated, the remaining participants no longer constitute a sufficient sample size.
Further, if the clinical trial operator does not get permission to use personal information for analytics, academic publication or presentation, or any other use of the trial results, then the trial operator cannot use the results in those ways. This means that either the trial sponsor insists on broad permissions to use clinical results for almost any purpose (which raises the specter of coercive permissions), or the trial is hobbled by an inability to use the data for opportunities that arise later. All in all, using subject permission as the legal basis for use of personal data creates unnecessary problems for clinical trials.
That leaves the following legal bases for use of personal data in clinical trials:
- a task carried out in the public interest under Article 6(1)(e) in conjunction with Article 9(2)(i) or (j) of the GDPR; or
- the legitimate interests of the controller under Article 6(1)(f) in conjunction with Article 9(2)(j) of the GDPR.
Not every clinical trial will be able to establish that it is being conducted in the public interest, especially where the trial doesn’t fall “within the mandate, missions and tasks vested in a public or private body by national law.” Relying on this basis means that a trial could later be challenged as unsupported by national law, and unless the researchers persuade legislators or regulators to pass or promulgate a clear statement of support for the research, this basis is vulnerable to privacy regulators’ whims.
Further, as observed by Hintze and LaFever, relying on this legal basis “involves a balancing test between those legitimate interests pursued by the controller or by a third party and the risks to the interests or rights of the data subject.” So even the most controller-centric of legal supports can be reversed if the local privacy regulator feels that a legitimate use is outweighed by the interests of the data subject. I suppose the case of Henrietta Lacks, had it arisen in the present-day EU, would be a clear situation where a non-scientific regulator could squelch a clinical trial because the data subject’s privacy rights were considered more important than any trial using her genetic material.
So none of the “legal basis” options is either easy or guaranteed not to be reversed later, once millions in resources have been spent on the clinical trial. Further, as Hintze observes, “The GDPR also includes data minimization principles, including retention limitations which may be in tension with the idea that researchers need to gather and retain large volumes of data to conduct big data analytics tools and machine learning.” This means that privacy regulators could step in, decide that a clinician has been too ambitious in her use of personal data in violation of data minimization rules, and shut down further use of the data for scientific purposes.
The regulators emphasize that “appropriate safeguards” will help protect clinical trials from interference, but I read such promises in the inverse. If a hacker gains access to data in a clinical trial, or if some of that data is accidentally emailed to the wrong people, or if one of the thousands of laptops lost every day contains clinical research, then the regulators will pounce with both feet and attack the academic institution (rarely a paragon of cutting-edge data security) as having lacked appropriate safeguards. Recent staggeringly high fines against Marriott and British Airways demonstrate the presumption of the ICO, at least, that an entity that suffers a hack or otherwise loses data will be punished severely.
If clinicians choosing where to site human trials understood this all-encompassing privacy law and how it throws the very nature of their trials into suspicion and possible jeopardy, I can’t see why they would risk enrolling residents of the European Economic Area. The uncertainty and risk created by aggressively intrusive privacy regulators taking a specific interest in clinical trials may drive important academic work overseas. If we see a data breach at a European university, or an academic enforcement action based on the rules cited above, it will drive home the risks.
In that case, this particular European shot in the privacy wars is likely to end up pushing serious researchers out of Europe, to the detriment of academic and intellectual life in the Union.
Damaging friendly fire indeed.