A recent lawsuit filed in San Diego Superior Court alleges that Sharp HealthCare recorded conversations between doctors and their patients without written consent, using that information to document the visits with an artificial intelligence program developed by a private Pittsburgh company.
While the lawsuit centers on a single medical provider in San Diego County and seeks to certify a class action composed of Sharp patients, it also sheds light on the quiet but widespread adoption of AI-powered clinical transcription software in mainstream medicine.
A survey of San Diego medical providers taken after the lawsuit was filed Nov. 26 shows several area providers use similar systems.
Rady Children’s Hospital in San Diego said in an email that it is “currently conducting a limited pilot of ambient scribing technology with clinicians required to obtain patient consent before use.” UC San Diego Health confirmed it uses a system called Nabla, which it installed “after a thorough security review.” The academic health system says it requires patient consent before using the system, which includes “annual written consent (plus) verbal consent from the patient and all parties present in the exam room at each visit.” Kaiser Permanente said in a statement that its clinicians “have access to a clinical documentation support tool that helps them securely capture initial clinical notes during patient visits, allowing them to focus more on patient care.” Kaiser says its system “requires the healthcare team to ask permission from our patients and others accompanying them before using the tool.”
Scripps Health, one of the four largest medical providers in the San Diego market, declined to discuss whether it would use such a system, saying in a statement that “this is not a topic we will discuss.”
Paradise Valley Hospital said it does not use any AI-based note-taking systems, while Palomar Health and North County Tri-City Medical Center did not respond to the question.
When asked to comment on the lawsuit’s allegation that it failed to properly inform patients about its medical documentation system, Sharp said that while “patient safety and privacy are our highest priorities at all times,” it is “unable to comment on pending litigation.”
The lawsuit, filed on behalf of Sharp patient Jose Saucedo by attorney Robert Salgado, seeks certification as a class action and alleges that Sharp violated medical privacy laws “by surreptitiously recording entire medical consultations using electronic recording devices and cloud-based processing systems, without notice or consent.”
The suit seeks unspecified compensatory and punitive damages, noting that the state penal code allows damages of $5,000 per violation.
The recordings, the lawsuit says, were forwarded to Abridge, a Pittsburgh-based technology company that recently received a major investment from the tech industry in its AI-based system, which it said in a Dec. 10 press release is now used “in more than 200 outpatient care settings annually.”
The company has published information on its website explaining to its customers and the public how its technology works.
An entry on “Recording Basics” in the company’s customer support center urges clinicians to “be sure to follow your organization’s recommended guidelines for patient consent,” even providing example language that might be used in such situations.
Abridge suggests that doctors could say, “I will use a tool that records our conversation to help me write my clinical note, so I can pay more attention to our conversation and spend less time on the computer. Is that okay with you?”
The technology uses an app installed on a doctor’s smartphone to perform the recording, which a list of “best practices” says should be placed “between you and your patient, without any obstruction.” Abridge also says on its website that its technology is “100% HIPAA compliant and uses industry best practices to protect patient information,” indicating that the data it collects “is always stored through secure channels” compliant with HIPAA, the Health Insurance Portability and Accountability Act of 1996, which prevents unauthorized disclosure of sensitive patient information.
But Abridge also indicates in the privacy policy on its website that it creates separate privacy agreements with each of its clients, directing patients to “refer to your provider’s Notice of Privacy Practices for information about how they treat your (protected health information).” Sharp lists a confidentiality policy on its website, although the document is dated April 14, 2003.
Abridge states that it complies with the California Consumer Privacy Act, and its website states that it meets the Level 2 requirements for “System and Organization Controls” created by the American Institute of Certified Public Accountants. These rules specify how digital information should be protected from unauthorized access, kept free from corruption and kept private. Abridge says its SOC 2 compliance has been “validated by an independent third-party security and privacy auditor.”
A statement that Abridge published on its website in 2020 says that it used 10,000 hours of transcribed conversations between doctors and patients to train its AI models, and that this “anonymized” information came from “fully informed and consenting patients.” A separate 2020 statement indicates that all research and development uses anonymized data and that this information is “acquired with the consent of the patient.” But it is unclear whether the company uses real-world conversations transmitted by its customers to Abridge’s servers to train future generations of the company’s models.
The company did not respond to a request for comment on the matter.
Privacy advocates are increasingly concerned about the rapidly evolving world of AI as it enters the world’s most sensitive spaces.
Sara Geoghegan, senior legal counsel for the Electronic Privacy Information Center in Washington, D.C., said it is abundantly clear that patient permission is required when making recordings.
“Certainly, transparency is a necessary step, as is disclosure and meaningful consent,” Geoghegan said.
And this consent, she added, should not be obtained just once, and it should be sought on its own rather than folded into the voluminous administrative paperwork that patients often complete during an office visit.
“It should be consent that is freely informed and can be revoked,” Geoghegan said. “Once every 10 years is not enough.”
The biggest problem, she added, will arise when the use of AI in healthcare moves from simple transcription to decision-making. Insurance companies have already been found to be using AI systems to deny claims, a practice that becomes illegal in 2025 under a new law, the “Physicians Make Decisions Act,” which requires that medical necessity determinations be made “only by a licensed physician or licensed health care professional competent to evaluate the specific clinical issues involved in the health care services requested by the provider.”
“I think that’s where utilization matters and where the limitations of technology matter,” Geoghegan said. “To me, a doctor who does all the medical work but uses technology to take some of the notes is very different from a situation that involves generative AI, where a doctor has a conversation with a patient and then a generative AI tool is the one that diagnoses and reports and, you know, performs the tasks that belong to the doctor.”
It does not appear that Abridge has crossed this particular Rubicon. The company’s public statements about its products indicate that the primary goal of the system is to accurately document patient-doctor conversations, and that doctors must review the information it produces and make any necessary corrections before it is added to a patient’s official medical record.
