Australian Health Practitioner Regulation Agency - Case studies

Case studies

Examples of newer generative AI tools used in healthcare

Case study 1

A practitioner dictates clinical findings to a non-health-specific generative AI tool (for example, ChatGPT) and asks for a clean, grammatically correct version, which is then pasted into the clinical notes.

This use of AI raises potential concerns about privacy, confidentiality and patient consent that the practitioner needs to consider. Before using generative AI, the practitioner should explain to their patients/clients how AI will be used and obtain their informed consent for its use.

If using generative AI tools to document patient health records, practitioners need to be aware of how and where patient data will be stored, and for how long. Generative AI tools such as ChatGPT may store data outside Australia. Any personal information entered into an AI tool that stores data offshore could lead to unintentional breaches of Australian privacy laws. Information provided to public generative AI software often becomes the property of the software provider, so it is also important that practitioners understand whether data they input to AI tools can be accessed, used or sold by the developer at its own discretion and without the patient’s consent.

Practitioners should be aware of the consequences if the data is not subject to Australian privacy and data storage laws. They may not realise that using patient data in this way requires the explicit informed consent of the patient/client, acknowledging that their data could be used, viewed and/or sold by others.

Additionally, because this tool is not trained on clinical knowledge it may misinterpret certain terms, abbreviations, or words that have both clinical and everyday meanings. This highlights the need for the practitioner to also verify the accuracy of the transcript.

Case study 2

A practitioner uses a health scribing tool to automatically generate clinical notes and a referral letter, which includes an additional plausible diagnosis generated by the tool.

This scenario raises several concerns. First, it raises similar concerns to case study 1 about privacy and data awareness, and the need for health practitioners to seek informed consent before entering confidential patient/client data into an AI tool.

Second, the practitioner needs to be aware of whether they are using a tool that is regulated by the Therapeutic Goods Administration (TGA). Some practitioners may assume that the ’intended use’ of all AI scribing tools is limited to scribing; however, because this tool suggested a diagnosis it meets the definition of a medical device and should therefore be regulated by the TGA. This type of tool must undergo premarket approval and be included in the Australian Register of Therapeutic Goods (ARTG) before being supplied in Australia, unless an exemption applies. If unsure, practitioners can check with the vendor or search the ARTG to confirm that the tools they are using are registered.

This case also flags known errors with AI scribing tools that many practitioners may not be aware of. In addition to adding a diagnosis, errors can include ’hallucinations’, which are incorrect or misleading AI-generated results that may appear factual, and omissions of information. Regardless of errors made by the technology, the practitioner remains responsible for the health records generated and for deciding whether or not to accept a diagnosis suggested by the tool. This highlights the importance of practitioners checking records and other documentation generated by AI for accuracy.


Page reviewed 22/08/2024