An Ontario audit found that AI-powered clinical note-taking systems frequently fabricate information, including nonexistent therapy referrals and incorrect prescriptions. The findings raise serious safety concerns about the reliability of these tools in healthcare settings, where inaccurate records could directly affect patient care and treatment decisions.
Why it matters: As healthcare providers increasingly deploy AI assistants for clinical documentation, the audit exposes critical gaps in accuracy and accountability that could endanger patient safety and erode trust in AI-assisted medical practice.