Artificial intelligence note-taking tools intended for use by Ontario doctors provided incorrect and incomplete information or demonstrated “hallucinations,” and were not evaluated adequately, the province’s auditor general says in a new report.
Ontario Auditor General Shelley Spence made the finding in the course of a broader probe of artificial intelligence use across the provincial public service.
During a provincial procurement process for AI medical systems that transcribe conversations between doctors and patients, government evaluators found serious errors in transcripts generated by 20 programs, she said.
“Inaccuracies in medical notes generated by AI scribe systems could potentially result in inadequate or harmful treatment plans that may potentially impact patient health outcomes,” the auditor’s report said.
“It is important that AI scribe systems are tested to provide assurances as to the quality of their generated notes to minimize inaccuracies.”
The evaluators ran a pair of test conversations between hypothetical doctors and patients through the AI systems to test their capabilities, and uncovered the problems, the report said.
Evaluators found that nine of the systems had so-called "hallucinations," fabricating information or adding suggestions to patient treatment plans that the doctors never made, such as referring a patient for therapy or ordering blood tests, the report said.
Some AI scribes captured false drug prescriptions
Transcripts created by 12 of the 20 programs contained incorrect information, such as capturing a different drug than the one prescribed by the doctor. Seventeen of the 20 systems missed key details about patients' mental health issues.
The auditor also found some of the system vendors did not submit third-party audit reports, certifications or threat risk assessments during the procurement process. Four systems were still approved by the government.
The systems are in use across the province; however, the government has issued guidelines directing doctors to manually review AI-generated notes to ensure their accuracy. The auditor recommended that the province implement IT controls in the systems to enforce an attestation from doctors confirming they reviewed the notes.

Spence said she personally saw AI scribe systems in use during a recent doctor’s visit.
“I actually went to my doctor because you can hear that my voice isn’t exactly what it normally is … and they were using AI scribe,” she said. “So, I kind of mentioned, ‘Please look at the transcript when you’re done.’”
Spence said she isn’t recommending Ontarians follow her example, but instead is urging the government to thoroughly test its scribe systems to ensure “that we’re buying AI that works for the people of Ontario.”
Errors made during testing phase: minister
Stephen Crawford, Minister of Public and Business Service Delivery and Procurement, said the errors pointed out in the report were found during the testing phase of the process.
“The doctors that go through and use this product oversee every aspect of it,” he said. “So, every decision that is made that comes out of any artificial intelligence anywhere is overseen by a professional.”
Crawford said the AI scribe systems are designed to save physicians time.
“That gives them more time to spend with their patients and less time in record keeping,” he said.
Health Minister Sylvia Jones did not participate in a government news conference to respond to the auditor general’s report on Tuesday.
On Wednesday, a spokesperson for the minister said in a statement that approximately 5,000 physicians across Ontario are currently using AI scribes and there have been no known reports of patient harm associated with their use in the province.
Doctors “must review and approve” all documentation before it is added to a patient’s medical record, Ema Popovic said in a statement.
“Use of this technology is entirely voluntary and requires patient consent, including informing patients how AI will be used, before it is used during an appointment,” she said.
Green Party Leader Mike Schreiner said the audit results are “deeply disturbing” and raise patient safety issues.
“If the government’s going to utilize these tools, we need to make sure … they work properly before they’re deployed,” Schreiner said.
The auditor says she made 10 recommendations to the government and its procurement agency to improve the use of AI across the public sector, and it has agreed to nine.
The recommendations focus largely on increasing security and privacy.
