
At first glance, these sources don’t seem related. But when you connect them, they reveal a pattern we can’t afford to ignore — and it’s more unsettling than most of us would like to admit. It’s time for an honest, slightly uncomfortable conversation about where we are — and maybe to sit down and remember what mom and dad always said about choices and consequences… even if we really didn’t want to hear it.
In this episode:
Choices Have Consequences – Ep 549
Today’s Episode is brought to you by:
Kardon
and
HIPAA for MSPs with Security First IT
Subscribe on Apple Podcasts. Share us on social media. Rate us wherever you find the opportunity.
Great idea! Share Help Me With HIPAA with one person this week!
Learn about offerings from the Kardon Club
and HIPAA for MSPs!
Thanks to our donors. We appreciate your support!
If you would like to donate to the cause you can do that at HelpMeWithHIPAA.com
Like us and leave a review on our Facebook page: www.Facebook.com/HelpMeWithHIPAA
When you see a timestamp on the left side of the text below, click it to jump directly to that part of the audio. Get the best of both worlds, from the show notes to the audio and back!
Choices Have Consequences
[00:47] None of this is surprising — and that’s exactly the problem.
Mapping the susceptibility of large language models to medical misinformation across clinical notes and social media: a cross-sectional benchmarking analysis – The Lancet Digital Health
Your mother loves you … well, check that out, and maybe AI as well – Health Data Management
Doctors Are Quietly Using AI. Bans Won’t Fix That. | MedPage Today – MedPage Today (free account required)
- Healthcare cybersecurity conversations feel cyclical
- “We’re still talking about ransomware.”
- “We’re still talking about third-party risk.”
- Now we’re adding AI governance into the mix
The real question:
What will it take — a patient harm event? A regulatory hammer? A catastrophic outage?
As we approach our 11th year of this podcast in May, it’s hard not to notice that many of the issues we discussed in year one are still here. Ransomware. Vendor risk. Governance gaps. The tools have changed. The headlines have changed. But the underlying risks haven’t gone away.
At what point do we admit that we keep choosing to live with these risks?
And that’s the hard conversation.
The Problems We Never Fixed
[08:30] The Health-ISAC report shows that the risks we’ve been discussing for more than a decade are still very much alive. They’ve matured. They’ve adapted. But they haven’t been resolved. And until they are, we don’t get the luxury of distraction. Because when known risks are ignored, the outcome is rarely surprising.
Ransomware Is Still Dominant
- 455 ransomware events in 2025
- Qilin went from 23 health sector victims in 2024 to 77 in 2025 — more than tripling its impact in just one year. This particular gang is super vicious with complex encryption and double extortion.
- Ransomware remains operational, not exceptional
Third-Party and Supply Chain Risk
- Episource exposure impacting 5.4 million individuals
- Cleo compromise bundling hundreds of victims
- CrowdStrike update disrupting 69 percent of surveyed organizations
- Resilience now as important as prevention
Medical Device and Infrastructure Fragility
- Legacy devices still active
- Windows end-of-life issues
- DICOM/PACS exposures
- Patient safety directly tied to cyber maturity
We haven’t stabilized the foundation — and we all know the strain is there. Burnout. Budget pressure. Vendor risk. Incident fatigue. And now we’re stacking a massive new weight on top of a structure that’s already stretched thin.
The Problems We’re Adding
[15:50] The Lancet study and the HDM articles drive home just how urgent this AI conversation has become. We can roll our eyes at the hype, but we cannot ignore the reality. AI is no longer emerging. It’s embedded. And pretending it’s not shaping patient care doesn’t make it less powerful — it makes it less governed.
And if you think this is still theoretical, it’s not. Two-thirds of physicians are already using some form of AI in their workday. Some of it is sanctioned. Some of it isn’t. But it’s happening.
AI Adoption Is Already Here
- Two-thirds of physicians using AI tools
- 17 percent using unapproved tools
- Shadow AI is cross-generational
Reliability Concerns
- 31.7 percent acceptance of fabricated medical information in benchmarking
- Even high-performing models accept false statements
- Clinical tone increases susceptibility
Governance Gap
- Bans do not stop usage
- Shadow AI fills workflow pain points
- Governance must meet reality
Choices and Consequences
[23:57] The Pattern
- Known risks
- Repeated warnings
- Incremental responses
The Leadership Question
- What event will force meaningful change?
- Patient harm?
- Enforcement?
- Financial collapse from a systemic outage?
This is not a knowledge problem. It is a prioritization and governance problem. Because when a crisis hits, the money shows up. The consultants show up. Executive attention shows up. Accountability shows up. So we know the system can respond. The real question isn’t whether we can act — it’s whether we’re willing to act before we’re forced to.
Before the consequences choose it for us.
In other words: don’t wait to have a heart attack before changing your diet.
At some point, we have to stop acting surprised. Ransomware isn’t new. Third-party risk isn’t new. Even AI isn’t “new” anymore — it’s just louder and faster. The real issue isn’t whether we know the risks. We do. The question is whether we’re going to make managing them a priority before a crisis makes it for us. Because when something finally breaks, the time, money, and attention will magically appear. The only question is whether we act now — or wait until the consequences decide for us.
Remember to follow us and share us on your favorite social media site. Rate us on your podcasting apps, we need your help to keep spreading the word. As always, send in your questions and ideas!
HIPAA is not about compliance,
it’s about patient care.™
Special thanks to our sponsors Security First IT and Kardon.


