
NOTE: This article presents preliminary guidance and is subject to change based on physician feedback and any new regulatory or legal standards in this rapidly evolving space.

In a recent survey, we found that over 90% of physicians see potential for artificial intelligence to help their medical practice in some way, including 70% who see an opportunity when it comes to charting and documentation. It’s no surprise, then, that Doctors Manitoba regularly receives inquiries from physicians asking about using AI for charting and other purposes, especially considering the 10+ hours per week physicians spend on administrative tasks. As we work together to find ways to reduce administrative burdens for physicians at a systemic level, we’re taking a look at some of the responsibilities that come with using AI.

Based on the growing interest, Doctors Manitoba is undertaking work to provide guidance to physicians about using AI as a scribe to assist with their documentation.

Within Manitoba, there are currently few, if any, explicit limits or constraints on using AI to assist with charting and documentation. While the College of Physicians and Surgeons of Manitoba does not currently prevent the use of AI tools by physicians, we strongly recommend a cautious approach that is consistent with the professional standards the College and your patients already expect.

Our review of organizations in other jurisdictions found a few that do offer guidance, with consistent issues for physicians to consider in their practice. It’s important to weigh both the benefits and the risks.

The potential benefits of AI related to charting are numerous.

Top of the list, of course, is time saving. Some physicians report spending an hour or more per day on charting. Sometimes this is completed late in the evening, after putting their kids to bed. Others block off one full day a week to complete charting and other administrative tasks. Some physicians report that an AI scribe dramatically cuts the time they spend on charting, leaving far less work to review the draft, update it, and finalize it in the patient’s chart. Less administrative time has also been linked to lower physician burnout.

Others have reported that using an AI scribe helps to decrease the distraction of typing or taking notes during each patient visit, allowing them to listen more attentively and build a stronger relationship with each patient.

Some have also reported that the quality and accuracy of their charting has improved, with some AI scribes able to efficiently pull in complex diagnostic, treatment and medication information. In fact, in a study published in JAMA Internal Medicine, University of California researchers compared physician responses to patient questions collected from the AskDocs community on Reddit with answers to the same questions generated by ChatGPT, a large language model AI tool. A panel of doctors reviewed the answers and determined ChatGPT’s answers were more detailed and rated far higher in both quality and empathy than the responses from the actual doctors.

Together, these benefits can help to improve access and quality for patients. However, like any other improvement in medicine, the benefits must be considered with the risks and professional obligations.

This is why other organizations have stressed a cautious approach to introducing AI into medicine. Just last year, the World Health Organization issued a bulletin calling for the safe and ethical use of AI for health, proposing that “clear evidence of benefit be measured before their widespread use in routine health care and medicine – whether by individuals, care providers or health system administrators and policy-makers.”

The American Medical Association notes that “physicians should understand that new technologies present new risks, particularly if they are using [them] in direct patient care. Physicians need to understand these risks and carefully weigh the benefits of engaging with these new tools prior to integrating them in their workflow.”

Your Responsibility: Consent, Privacy, and Accuracy

In our review, we found Colleges in BC and Alberta have provided guidance on using AI in physicians’ practices, as has CMPA. We also found guidance from the American Medical Association. A consistent theme across these groups: making use of AI for charting does not change physicians’ duties to secure patient consent, preserve patient privacy, and maintain accurate records of patient interactions.

Privacy

Your obligation to ensure the privacy of patients’ personal health information is no different with AI than in any other area, but AI tools can add another layer of complexity to this duty.

AI scribes often rely on a central or cloud-based service to process and analyze the information collected locally on your computer. This means patient information, and perhaps even a recording of the patient, might be transmitted to a vendor’s server, which could be located in another country. Sometimes, this information is retained to help the AI system learn and improve. Even without their name or personal health identification number being transmitted, a patient could be identifiable based on the clinical uniqueness of their case.
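To make this concrete, here is a minimal, purely illustrative Python sketch of a “de-identification” step that strips a patient’s name and any 9-digit PHIN-like number from a transcript before it is transmitted anywhere. The naive_redact function and the sample note are hypothetical, not taken from any vendor’s product; the point is that even after the obvious identifiers are removed, the remaining clinical detail can still make a patient identifiable.

```python
# Illustrative sketch only (hypothetical, not from any vendor): a naive
# "de-identification" step applied to a transcript before it leaves the clinic.
import re

def naive_redact(transcript: str, patient_name: str) -> str:
    """Remove the patient's name and any 9-digit PHIN-like numbers."""
    redacted = transcript.replace(patient_name, "[PATIENT]")
    redacted = re.sub(r"\b\d{9}\b", "[PHIN]", redacted)
    return redacted

note = ("Jane Doe, PHIN 123456789, 42-year-old presenting with an unusual "
        "combination of findings after a workplace injury in a small town.")
print(naive_redact(note, "Jane Doe"))
# Prints: "[PATIENT], PHIN [PHIN], 42-year-old presenting with an unusual ..."
# Even with the name and number removed, the unique clinical and contextual
# details could still identify the patient, which is why de-identification
# alone is not a complete privacy safeguard.
```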

CMPA advises that physicians should consider whether the AI scribe they intend to use meets the applicable privacy requirements in their jurisdiction. For example, privacy legislation generally requires that custodians “adopt reasonable safeguards to protect personal health information under their control.” It is important to ask how an AI vendor meets local privacy legislation and standards.

If you practice in a hospital or facility where you are not the custodian of patient records, you should first obtain permission to use an AI tool.

A reminder that “HIPAA compliance” refers to American privacy standards, not Canadian ones.

Accuracy

As a physician, you are responsible for your documentation. If you had assistance crafting your patient notes, whether from AI or a human, you must review, edit, and finalize all notes under your name.

The CMPA also does not oppose the use of AI scribes but reminds physicians to ensure patient chart entries accurately reflect the patient encounter. Physicians should review and correct any AI-generated text for the patient’s chart in a timely way, just as you would if a resident or human scribe had helped prepare the notes. You can review the CPSM Standard of Patient Documentation here.

Despite how far the tools have come, generative AI has limitations and risks, especially when applied to writing patient documentation. Aside from privacy concerns, AI risks can include:

  • Misinterpreting the context or discussion
  • Accessing and incorporating misleading, incorrect or outdated medical information
  • “Hallucinating” by generating false information presented in a confident and convincing way
  • Bias, as generative AI tools rely on information published elsewhere without the ability to identify potentially biased, discriminatory or stereotyping content
  • A lack of regulation or standards for this rapidly evolving technology, which means it can be difficult to know whether a specific AI product meets the standards, ethics, accuracy, and performance you expect from other technologies in health care

Patient Consent

If you are using AI to assist with charting, you should advise the patient and seek their consent. This is especially important if the AI tool you are using records or transcribes the full conversation before generating a summary for your chart documentation. Some AI scribe programs record and preserve the encounter, while others transcribe in real time and do not create a recording. Ensuring consent is informed means being ready to explain how privacy is maintained and what safeguards ensure the record is accurate.

CMPA says it plainly: “prior to making any recording of a clinical encounter, you should obtain patient consent.” They also note that if the data may be de-identified and used to improve the algorithm (i.e., allow the AI scribe to “learn”), “this should also be explained to the patient.”

Bottom Line

Remember, the onus remains on physicians, and likely will remain so in the future, to ensure any AI-generated entry is accurate, that the patient has consented to the technology’s use, and that the privacy of patients’ information is maintained.

If using AI in your practice, you should ensure:

  • Patient privacy is protected.
  • Any chart note or documentation is reviewed, evaluated for bias, and accurate.
  • Patient consent is obtained for the use of the technology.

Doctors Manitoba will be here to help you adapt in a rapidly changing environment. Please share your questions and/or your experience with AI by emailing adminburden@doctorsmanitoba.ca.

Common AI Terminology

  • Artificial Intelligence: the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings (Britannica.com). This includes learning, reading, writing, creating, and analyzing (IBM). The American Medical Association uses the term “augmented intelligence” instead to focus on how AI enhances human intelligence rather than replaces it.
  • Generative AI: artificial intelligence capable of generating text, images or other data using generative models, often in response to prompts. Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics. (Wikipedia)
  • Large Language Model (LLM): a language model notable for its ability to achieve general-purpose language generation and understanding (Wikipedia). It uses deep learning techniques and massively large data sets to understand, summarize, generate and predict new content (TechTarget). LLMs are a type of generative AI used to generate text-based content.
  • Natural Language Processing (NLP): technology that enables a computer to “understand” the contents of documents, including the contextual nuances of the language within them. The technology can then accurately extract information and insights contained in the documents as well as categorize and organize the documents themselves (Wikipedia).
  • Automatic Speech Recognition (ASR): a capability that enables a program to process human speech into a written format. While it is commonly confused with voice recognition, speech recognition focuses on translating speech from a verbal format to a text one, whereas voice recognition seeks to identify an individual user’s voice (IBM).
  • ChatGPT: a well-known generative AI tool built on OpenAI’s proprietary large language models, allowing users to ask questions, provide context and refine results in a conversational format. (A conceptual sketch of how these terms fit together in an AI scribe follows this list.)
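To show how the terms above fit together, here is a conceptual Python sketch of a typical AI scribe pipeline. It is not based on any specific product: transcribe_audio and draft_note_with_llm are hypothetical placeholder functions standing in for a vendor’s speech-recognition and generative-AI services.

```python
# Conceptual sketch of an AI scribe pipeline (hypothetical placeholders only).

def transcribe_audio(audio_file: str) -> str:
    """Hypothetical ASR step: convert the recorded encounter to text."""
    return "placeholder transcript of the patient visit"  # stands in for a vendor's speech-recognition service

def draft_note_with_llm(transcript: str) -> str:
    """Hypothetical generative AI / LLM step: summarize the transcript into a draft note."""
    return "placeholder draft note based on: " + transcript  # stands in for a vendor's large language model

def ai_scribe(audio_file: str) -> str:
    transcript = transcribe_audio(audio_file)   # speech recognition (ASR)
    draft = draft_note_with_llm(transcript)     # generative AI / LLM drafting
    return draft                                # the physician must still review, correct, and finalize

print(ai_scribe("visit_recording.wav"))
```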
Last updated: March 4, 2024