Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said – The Independent

From Google: 2024-10-26 00:15:41

Researchers have found that an AI-powered transcription tool used in hospitals creates sentences that patients never actually said, raising concerns about the accuracy and reliability of such technology in healthcare settings.

The tool is designed to transcribe doctor-patient interactions but has been found to fabricate statements, including medical details that were never mentioned. If healthcare professionals rely on these transcripts as accurate, this could lead to misdiagnosis or incorrect treatment plans.

The study, published in the journal Nature, found that the AI-generated transcriptions were often inaccurate and contained misinformation. Researchers caution against relying on such tools in critical healthcare settings until their accuracy and reliability improve.

Healthcare professionals are advised to use AI-powered transcription tools with caution and to verify the output independently to prevent errors in diagnosis and treatment. Researchers are calling for further development and testing to improve the tools' reliability and usability in healthcare.
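As a purely illustrative sketch of what "verifying the output independently" could look like in practice, the short Python example below compares an AI-generated transcript against a second, independently produced transcript and flags every span where they disagree for human review. The function name, the sample sentences, and the assumption that a reference transcript is available are all hypothetical; this is not the tool or workflow described in the article.

```python
# Hypothetical sketch: flag spans where an AI transcript diverges from an
# independently produced reference transcript, so a clinician can check those
# spans against the original audio. Illustrative only.
import difflib


def flag_divergent_segments(ai_transcript: str, reference_transcript: str) -> list[str]:
    """Return human-readable notes for spans where the AI transcript diverges."""
    ai_words = ai_transcript.split()
    ref_words = reference_transcript.split()
    matcher = difflib.SequenceMatcher(a=ref_words, b=ai_words)
    flags = []
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op == "equal":
            continue  # identical spans need no review
        ref_span = " ".join(ref_words[i1:i2]) or "(nothing)"
        ai_span = " ".join(ai_words[j1:j2]) or "(nothing)"
        flags.append(f"{op}: reference said {ref_span!r}, AI transcript has {ai_span!r}")
    return flags


if __name__ == "__main__":
    # Hypothetical example data, not taken from the study.
    reference = "Patient reports mild headache for two days, no medication taken."
    ai_draft = "Patient reports mild headache for two days, took medication daily."
    for note in flag_divergent_segments(reference, ai_draft):
        print(note)  # each note marks a span requiring human verification
```

Any automated check like this only narrows down where a human should look; it does not replace listening to the recording or confirming details with the patient.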
