Would you all consent to having your medical data processed by an AI scribe service?

My psychiatrist’s company is using DeepScribe (EDIT: actually Freed.ai). Our current sessions are terribly boring these days, mostly updates on fitness routines and slowly cutting back on dosages, but I still discuss the medications I’m taking. I have family working in a hospital, so I deeply appreciate the problem this solves and how annoying it is for clinicians to keep track of and write up all these notes. Even so, this just doesn’t sit well with me.

Here’s the usage description from my clinician:

The AI platform that your provider is using has followed Health Insurance Portability and Accountability Act (HIPAA) compliance guidelines to ensure your data is secured and has a HIPAA-compliant Business Associate Agreement in place. Your provider is committed to protecting the privacy and security of your Personal Health Information (PHI). Your PHI is identifying information about you that relates to your health or to the provision of health care to you. For example, during your appointment, you and your provider might discuss your mental health history, current problems and symptoms, and medication use; all of this information would be considered your PHI.

AI technology will collect your PHI as follows: it will create a word-for-word transcript of the dialogue between you and your provider during your appointment, using voice recognition software and other technologies, from an audio/visual recording of your appointment with your provider. At the conclusion of each appointment, AI technology uses the transcript from that appointment to generate a clinician note which summarizes the discussion between you and your provider during the appointment, including any action items that were discussed. The provider will analyze each appointment transcript to support the provider’s clinical documentation of the appointment. Your provider uses the appointment audio/visual recordings generated by AI technology solely to confirm the correctness and completeness of appointment transcripts. After that, the audio/visual recordings are securely destroyed.

Retention:

Your provider will retain PHI only for as long as necessary to fulfill the purposes for which it was collected, unless otherwise permitted or required by law. For example, we will retain each audio recording of an appointment only for as long as necessary to confirm the accuracy of the accompanying transcription, and we will retain that transcription only for as long as necessary for your healthcare provider to complete their clinical documentation for that appointment. After that, the transcript will be securely destroyed.

DeepScribe is partnering with Anthropic, so that’s likely where they get their base model, which is better than OpenAI/Microsoft or Google. But I have no clue whether it’s centralized or runs locally, or whether that will change.

Freed.ai is partnering with Microsoft/OpenAI, so this one is just a definite no.

Security overview, if anyone was curious:

Cloud Hosting and Availability

All hosting services and data are stored and processed within Microsoft’s Azure secure data centers

Freed has a HIPAA-compliant Business Associate Agreement with Microsoft

Freed leverages Azure’s high-availability infrastructure to ensure the data is always accessible

HIPAA just feels like empty promises these days, and Azure is the dullest tool in the shed when it comes to cloud data providers.

Artificial Intelligence

All AI models are HIPAA-compliant and don’t retain data

Protected health information is never used for AI training purposes

So clearly anonymized data will be used to train OpenAI models…bleh.

What do you all think? Should I be the one appointment in my psychiatrist’s month that requires them to take handwritten notes?

Update: after realizing which tool they are actually using and how they say the data is stored and processed, I’m not even asking the question anymore; it’s a hard no for me. OpenAI/Microsoft is trying to reach Google’s level of collecting PII and sensitive data. I’m going to be that person.

Even without getting into the privacy nightmare you just described, I can’t imagine how this tool is helpful. People are terribly inefficient at communicating their thoughts, and handwritten notes by someone who is actively listening to you will be far more useful to them in the future than a word-for-word transcript or an LLM-generated summary.

I would never accept this kind of data being stored in the cloud, encrypted or not. I suspect that when they say "end-to-end" they really mean in-transit encryption, since the LLM is inevitably being fed an unencrypted audio file or plain-text transcript.
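To illustrate the distinction (a minimal sketch using Python’s cryptography package, with made-up data; this is not anything Freed actually runs): with genuine end-to-end encryption, the server only ever holds ciphertext, which no LLM can transcribe or summarize. The moment the vendor’s service needs to process the recording, it has to hold the key and decrypt, and at that point you have in-transit and at-rest encryption at best.

```python
# Sketch: why "end-to-end encryption" and cloud LLM processing don't mix.
# Requires: pip install cryptography. All data here is hypothetical.
from cryptography.fernet import Fernet

# Key generated on the clinician's device and never uploaded.
client_key = Fernet.generate_key()
client = Fernet(client_key)

transcript = b"Patient reports tapering dosage; discussed fitness routine."
ciphertext = client.encrypt(transcript)

# True E2EE: the cloud stores only this opaque blob. An LLM cannot
# transcribe or summarize it; without the key it is just noise.
print(ciphertext[:40], "...")

# For the vendor's LLM to generate a note, the *server* must decrypt,
# which means the server must hold client_key. At that point the scheme
# is merely in-transit plus at-rest encryption, not end-to-end.
server_side_plaintext = Fernet(client_key).decrypt(ciphertext)
assert server_side_plaintext == transcript
```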

Paper is king because paper can’t be hacked remotely and it avoids introducing any third party.


To be fair, most notes are digitized. But when I say handwritten, I mean typed up as a summary by the clinician and then later (after the appointment, or at the end of that clinician’s shift) expanded so it is easily read and understood by other clinicians.

There are benefits to storing notes digitally, like audit logs, but it needs to be done properly: ideally end-to-end encrypted and kept local. I wouldn’t want it sent to a cloud-based LLM. A transcript from the native voice recorder app on iOS or Pixel phones would be far preferable, since that transcription happens on-device (something like the sketch below).
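For what it’s worth, fully local transcription is already practical. Here’s a minimal sketch using the open-source openai-whisper package (my own example, not what any scribe vendor ships; the file name is made up). The model weights download once, then inference runs entirely on-device, so no audio ever leaves the machine:

```python
# Sketch: fully local speech-to-text, no cloud service involved.
# Requires: pip install openai-whisper, plus ffmpeg installed on the system.
import whisper

# Weights are downloaded once and cached; inference runs on this machine.
model = whisper.load_model("base")

# Hypothetical recording of an appointment; the audio never leaves the device.
result = model.transcribe("appointment.wav")
print(result["text"])
```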

Yeah, most folks use Epic, which, depending on the hospital, will do this.