Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said – The Seattle Times
From Google: 2024-10-26 00:17:06
Researchers have found that an AI-powered transcription tool commonly used in hospitals has been fabricating statements that were never actually spoken. The discovery raises concerns about the accuracy of medical records and the potential legal implications of relying on AI transcription. The tool, identified in the report as "Dragon Medical One," has been found to insert errors into patient records; some clinicians reported that it invented fictional patients and added incorrect information. Because such errors carry significant consequences for patient safety and quality of care, experts are urging hospitals to reconsider their use of AI transcription tools.