A federal appeals court ruling in favor of the University of Chicago and tech giant Google in a case involving artificial intelligence (AI)-powered research might, at first glance, look like a green light for providers to collaborate with technology companies on such research. However, the ruling also raises numerous questions about AI's use of limited data sets and anonymized data, two attorneys explained.
In the case, Dinerstein v. Google, the appeals court in July affirmed the dismissal of the suit, ruling that the plaintiff lacked standing.[1] But in doing so, the appeals court left various issues unresolved, said Beth Pitman, partner in Holland & Knight’s Healthcare Regulatory and Enforcement Practice Group.
“The ruling appears to be a win for privacy compliance officers, but we really caution against thinking that this creates an open highway with no red light for the disclosure of an EHR [electronic health record] or access to an EHR record for AI development,” Pitman explained in an Oct. 5 podcast.[2]
Google, U of Chicago Collaborated
In 2017, Google teamed up with the University of Chicago to develop software capable of anticipating patients’ future health care needs. “They wanted to be able to use their EMR [electronic medical record] information in order to engage in predictive analytics for health care,” Pitman said. “The collaboration focused on using machine learning techniques to predict hospitalizations identified by instances when a patient’s health is declining.”
The research venture had the potential, if successful, “to reduce medical complications, eliminate unnecessary hospital stays, and, ultimately, improve patients’ healthcare outcomes,” the appeals court wrote in its opinion.
As an initial step in this research effort, the university delivered years’ worth of anonymized patient medical records to Google, supplying the tech giant with the information needed to train its algorithms, according to the court’s opinion. “A Data Use Agreement governed the transfer. Restricting Google’s use of the records to a list of specific research-related activities, the agreement expressly prohibited the company from attempting to identify any patient whose records were disclosed,” the court said.
The anonymized records included those of Matt Dinerstein, who had twice been an inpatient at the hospital during the period covered by the disclosure. Dinerstein argued that the records retained identifiable date stamps and doctor notes, and he sued Google and the university on behalf of himself and a class of other patients whose anonymized records were disclosed.
Initially, Dinerstein claimed that the university had breached either an express or an implied contract traceable to a privacy notice he received and an authorization he signed upon each admission to the medical center. He also asserted a claim for unjust enrichment, alleging that the university had reneged on its promise of patient confidentiality and thereby violated the Illinois Consumer Fraud and Deceptive Business Practices Act.
Against Google, he asserted claims of unjust enrichment and tortious interference with his contract with the university. He also brought privacy claims against all defendants.
In an opinion handed down in September 2020, District Judge Rebecca Pallmeyer granted the defendants’ motion to dismiss, finding that the consumer fraud claim should be dismissed for lack of standing and the rest of the suit should be dismissed for failure to state a claim.[3] The U.S. Court of Appeals for the Seventh Circuit agreed with the district judge’s decision to dismiss the case but disagreed with her reasoning, saying instead that Dinerstein had not adequately alleged standing to pursue any of his claims.