We discussed this whole Alexa and HIPAA thing before.  This week came the big announcement from Amazon that had headlines telling us that Alexa is HIPAA compliant with some slick new medical skills. Time to talk about her again.  Let’s see what the announcement really said.

Alexa and HIPAA

A 5 star review is all we ask from our listeners. Really.

In this episode:

Alexa and HIPAA Round 2

Today’s Episode is brought to you by:

Kardon 

and

HIPAA for MSPs with Security First IT

Subscribe on Apple Podcasts. Share us on Social Media. Rate us wherever you find the opportunity.

Next HIPAA Boot Camp

Session #2 – May 15, 16, 17

 Early Bird ends April 23 

Session #3 TBD

Somewhere and sometime after Labor Day

www.HelpMeWithHIPAA.com/bootcamp

Share us with one person this week!

Like us and leave a review on our Facebook page: www.Facebook.com/HelpMeWithHIPAA

Before we get started, there is a great home-devices story we have to share. A woman called 911 over a home invader. It was a Roomba. Grime doesn’t pay. This one is just a great tale about us getting used to living in homes full of robots.

Are Alexa and HIPAA Finally Together?

We did an episode on Alexa and HIPAA almost 2 years ago with Alexa and HIPAA Plus Other Questions – Ep 117.  At the time, we discussed the way that Alexa works and several other reasons that we have privacy and security concerns.

Before we go too far, though, let me say that I think the concept and a lot of the things you can do with all of these voice-activated devices are just awesome.  I love being able to just say out loud what I need and get a response from someone.  It feels like she cares about me and wants to help me.

It is also a very good solution for certain situations and people.  My elderly in-laws love theirs, primarily for music.  The device is especially helpful for them, though.  “She” reminds them to take meds and run errands, tells them the weather forecast and when something is on TV and where, and plays any music that suddenly pops into their heads.  As long as they can remember her name, she usually gives them what they ask for.

Do I love the concept?  Oh, yeah!  I want one really badly!  

Is there one in my home?  Nope, not until some of my concerns are addressed.

The device has been subpoenaed in a murder trial, it has let out creepy random laughs because it thought people asked it to, and it even sent a recorded conversation from one couple to a completely random person in their contacts.

We sure got a burst of social media mentions and direct messages when the news came out that Alexa has rolled up in here with some “HIPAA compliant” medical skills, so I think it is time to talk about her some more.

There were some great headlines like “Alexa can you spell HIPAA?”, “‘Alexa, find me a doctor’” and “What the HIPAA-compliant Alexa skills mean for the future of voice in healthcare”.  It was all very exciting indeed.

I need to point out that while those are the headlines, the details in the articles tell it differently.  Phrases like Alexa now has “HIPAA-compliant data transfers” appear in the details but not in the headlines.  The announcement also uses statements like these new skills are “now operating in our HIPAA-eligible environment.”

We have a little bit of work to get from these statements to a fully “HIPAA compliant” solution, wouldn’t you say?  If you only read the thousands of headlines crowing the mantra, you may miss details like these.  I wanted to know what they mean when they make HIPAA announcements with such specific language.

First, let’s recap our biggest issue with the service: the way it works.  It takes recordings that it determines may include you triggering it, sends them to the cloud for processing, and stores them on your account.  Those recordings sometimes include information that had nothing to do with the request.  We didn’t like that part at all.

The official Alexa and HIPAA announcement actually states:

We’re excited to announce that the Alexa Skills Kit now enables select Covered Entities and their Business Associates, subject to the U.S. Health Insurance Portability and Accountability Act of 1996 (HIPAA), to build Alexa skills that transmit and receive protected health information as part of an invite-only program.

With the invite-only thing going on, we don’t get all the backend details just yet.  Basically, the concept of an Alexa “skill” is that you create code that can be added to her list of things she knows how to do.  The code is triggered by some specific statement like “find me a doctor.”  When that is processed, Alexa takes the information it has from you and sends it over to a web service, run by someone else, that provides the answer and sends it back to Amazon and then back to Alexa.  The process continues as a conversation that goes from the machine, through Amazon, to the skill provider, and back.
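For the curious, that round trip can be sketched in a few lines of code.  This is a simplified, hypothetical illustration, not Amazon’s actual implementation: the skill’s backend (often an AWS Lambda function) receives a JSON request that Amazon built from the user’s speech, and returns JSON telling Alexa what to say back.  The skill name, intent name, and reply text below are made up for the example, and the real request/response documents carry many more fields.

```python
import json

def handle_request(event):
    """Sketch of an Alexa custom-skill backend: route a JSON request
    from Amazon to a spoken response.  (Hypothetical intent name.)"""
    request = event["request"]
    if request["type"] == "IntentRequest" and \
            request["intent"]["name"] == "FindDoctorIntent":
        speech = "Here are primary care doctors near you."
    else:
        speech = "Sorry, I did not understand that."
    # The skill replies with text that Alexa speaks back to the user.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }

# A trimmed-down version of the JSON Amazon might send when a user
# says "Alexa, find me a doctor":
event = {
    "request": {
        "type": "IntentRequest",
        "intent": {"name": "FindDoctorIntent"},
    }
}
print(json.dumps(handle_request(event), indent=2))
```

The privacy question is exactly this hop: the user’s words, and whatever data rides along with them, pass through Amazon’s systems on the way to and from the skill provider.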

It is very cool that you can do this, and they are building Alexa into other devices besides speakers.  The skills are hosted by the providers themselves or on AWS Lambda services.  That may be what they have created differently that makes up the “HIPAA-eligible environment.”  The data-transfer bit is the conversation between all those services being properly encrypted.  It is the middle bit that we don’t have full details on, or at least I haven’t seen them if they are out there.  Unfortunately, that is the part we have always worried about the most: what happens to those conversations once they are inside the Amazon world.

Our concerns were not in any way made better when the next set of Amazon headlines hit.  The timing of these two stories was really inconvenient.

Alexa and HIPAA, Maybe Not So Much

Amazon reportedly employs thousands of people to listen to your Alexa conversations was the big CNN headline within a week of the HIPAA skills headlines.  The story was broken by Bloomberg.  Here are the bits of the story that got my attention.

Amazon reportedly employs thousands of full-time workers and contractors in several countries, including the United States, Costa Rica and Romania, to listen to as many as 1,000 audio clips in shifts that last up to nine hours. The audio clips they listen to were described as “mundane” and even sometimes “possibly criminal,” including listening to a potential sexual assault.

Amazon doesn’t “explicitly” tell Alexa users that it employs people to listen to the recordings. Amazon said in its frequently asked question section that it uses “requests to Alexa to train our speech recognition and natural language understanding systems.”

People can opt out of Amazon using their voice recordings to improve the software in the privacy settings section of the Alexa app.

Bloomberg said that Alexa auditors don’t have access to the customers’ full name or address, but do have the device’s serial number and the Amazon account number associated with the device.

Sorry, but that was a mic drop for me, at least for now.  I love the concept; it is just how little we know about what is going on behind the scenes with these things.  I am going to need a lot more information before I can recommend CEs and BAs use these devices within listening distance of PHI.  At least not until I can feel like my clients aren’t being hung out to dry in the latest tech deal that puts privacy and security last, not first, when planning development.

We should also mention that Google’s competitor to Alexa, Google Home and Google Assistant, are not covered by Google’s BAA and therefore not HIPAA compliant alternatives for this reason alone. There’s really no need to discuss those devices beyond that, at the moment anyway.

Remember to follow us and share us on your favorite social media site. Rate us on your podcasting apps, we need your help to keep spreading the word.  As always, send in your questions and ideas!

HIPAA is not about compliance, it’s about patient care.™
