AI Model Used By Hospitals Caught Making Up Details About Patients, Inventing Nonexistent Medications and Sexual Acts
Health Scare

In a new investigation from The Associated Press, dozens of experts have found that Whisper, an AI-powered transcription tool made by OpenAI, is plagued with frequent hallucinations and inaccuracies, with the AI model often inventing completely unrelated text. What's even more concerning, though, is who's relying on the tech, according to the AP. […]
Link: https://futurism.com/the-byte/whisper-nabla-hospital-ai-details-patients