Security threats from third parties are among the most critical risks facing the healthcare industry. Vendors' growing use of artificial intelligence adds a new layer of third-party concern, said independent consultant Rick Doten, former CISO of a large healthcare company.
Healthcare vendors that handle HIPAA-protected health information should face close scrutiny from their healthcare customers over how those companies use AI models, the data they collect and how AI-based agents interact with sensitive systems and accounts, Doten said.
“What AI models are you using? Are they public? Are they private? Are you using platforms that leverage AI? Are you doing this for analytics?” are among the questions healthcare entities should ask third-party contractors, he said.
“It’s not just about data protection, but also about its appropriate use,” he said. “Is the AI collecting PHI when it shouldn’t be? Are you using agents to be able to do processes that may be registering or using accounts, or having access to information that may not be necessary for the process, but is just also sort of fed into the mesh of whatever it’s doing?”
In the audio interview with Information Security Media Group, Doten also discussed:
- Dealing with vendors during disruptive security incidents;
- Resources to help small hospitals and other healthcare providers better manage their overall cybersecurity risk;
- Why a HIPAA security risk analysis is so difficult for many regulated entities to perform.
Doten is the former CISO and vice president of information security at Centene Corp. He also previously worked as a virtual CISO for international companies. He is a member of the Cloud Security Alliance’s CXO Trust Advisory Board, as well as the boards of the local Charlotte ISC2 and CSA chapters. He works with several venture capital and commercialization firms examining security technologies, and serves on the advisory boards of several startups.
