In our review, we found consistent guidance on using AI in physicians’ practices that centres on three key obligations. Using AI for charting does not change physicians’ duties to secure patient consent, preserve patient privacy, and maintain accurate records of patient interactions.
Privacy
Your obligation to ensure the privacy of patients’ personal health information is no different with AI than in any other area, but AI tools can add another layer of complexity to this duty. AI tools often rely on a central or cloud-based system to process and analyze the information collected locally on your computer. This means patient information, and perhaps even a recording of the patient, might be transmitted to a vendor’s server, which could be located in another country.
- Sometimes, this information is retained to help the AI system learn and improve. Even without their name or personal health information number being transmitted, a patient could be identifiable based on the clinical uniqueness of their case.
- CMPA advises that physicians should “consider whether the AI scribe they intend to use meets the applicable privacy requirements in their jurisdiction. For example, privacy legislation generally requires that custodians adopt reasonable safeguards to protect personal health information under their control.” It is important to ask how an AI vendor meets local privacy legislation and standards.
- If you practise in a hospital or facility where you are not the custodian of patient records, you should first obtain permission to use an AI tool.
A reminder that “HIPAA compliance” refers to American privacy standards, not Canadian ones. Manitoba physicians must comply with Canada’s privacy laws and Manitoba’s Personal Health Information Act (PHIA).
Accuracy
As a physician, you are responsible for your documentation. If you are assisted in crafting your patient notes, whether by AI or a human, you must review, edit, and finalize all notes under your name.
➡ The Canadian Medical Protective Association (CMPA) does not oppose the use of AI scribes but reminds physicians to ensure patient chart entries accurately reflect the patient encounter. Review and correct any AI-generated text for the patient’s chart in a timely way, just as you would if a resident or human scribe had helped prepare the notes. You can review the CPSM Standard of Patient Documentation here.
Despite how far the tools have come, generative AI has limitations and risks, especially when applied to writing patient documentation. Aside from privacy concerns, AI risks can include:
- Misinterpreting the context or discussion.
- Accessing and incorporating misleading, incorrect or outdated medical information.
- “Hallucinating” by generating false information presented in a confident and convincing way.
- Bias, as generative AI tools rely on information published elsewhere without the ability to identify potentially biased, discriminatory, or stereotyping content.
- A lack of regulation or standards for this rapidly evolving technology, which means it can be difficult to know whether a specific AI product meets the standards, ethics, accuracy, and performance you expect from other technologies in health care.
To reiterate, it is ultimately your responsibility to review, edit, and finalize all notes under your name.
Patient Consent
If you are using AI to assist with charting, you should first advise the patient and seek their consent personally. This is especially important if the AI tool you are using records or transcribes the full conversation before generating a summary for your chart documentation. Some AI scribe programs record and preserve the encounter, while others transcribe in real time and do not create a recording. Ensuring consent is informed means being ready to explain how privacy is maintained and what safeguards ensure the record is accurate.
- CMPA says it plainly: “prior to making any recording of a clinical encounter, you should obtain patient consent.” They also note that even “if the data may also be de-identified and used to improve the algorithm (i.e., allow the AI scribe to ‘learn’), this should also be explained to the patient.”
- If a patient does not consent, or raises questions or concerns you do not feel comfortable answering, you should not use the AI scribe for the interaction.
- The patient’s consent should be obtained by the physician and noted in the chart. It may be helpful to post signs in your clinic about the use of AI scribes and to provide a brochure or handout to patients. However, these efforts complement, and do not replace, your discussion with the patient.
- Patient consent is not required for each interaction, provided it was documented the first time. However, physicians should use their best judgment as to whether changes in the patient’s circumstances, or in the issues discussed, may affect the patient’s comfort with AI use.
See Canada Health Infoway’s Patient FAQ; consider having it on hand for hesitant patients to help inform them about AI use in medicine.