OpenAI’s Whisper tool may add fake text to medical transcripts, investigation finds.
Errors and hallucinations are definitely serious concerns, but my biggest concern would be privacy. If my GP is using AI, I no longer see my medical information as private, and that is unacceptable.
I work in judicial tech and have heard questions about using AI transcription tools. I don't believe AI should be used in this kind of high-risk area. The ones asking whether AI is a good fit for court transcripts can be forgiven, because all they see is the hype, but if the ones responding greenlight a project like that, there will be some incredibly embarrassing moments.
My other concern is that the court would have to run the service locally. There are situations where a victim's name or other information is redacted. That information should not be on an OpenAI server, and it should not be regurgitated back out when the AI misbehaves.
Don't court stenographers basically use tailored voice models and voice-to-text transcription already?
Regular transcription software is finally respectable (the early days of Dragon NaturallySpeaking were dark indeed). Who thought tossing AI into the mix was a good idea?
God, I hope this isn't the AI plan that the NHS adopts.
This is the AI plan every healthcare entity worldwide will adopt.
No joke. They are desperate for shit like this.