More Huge News!!

Meeting Your Professional Obligations When Using AI in Healthcare

It’s more huge news this week!! AHPRA now has a page on “Meeting your professional obligations when using Artificial Intelligence in Healthcare.” Last week we announced that AHPRA had listed a new digital competency fact sheet mentioning AI: https://bit.ly/hugewinai. Now, in addition, AHPRA is explicitly describing our obligations. See their info page here: https://bit.ly/AIAHPRA

AHPRA notes that AI is "becoming rapidly integrated into many areas of healthcare". This creates a need for further education for AHPRA-regulated clinicians on how they can or should use "AI in their practice".

AHPRA draws attention to the potential of AI tools to:

  • "Improve client care and patient satisfaction by reducing admin burdens and health practitioner burnout". This means that AHPRA has noticed that AI tools can save clinicians time. I believe AI can save clinicians 5-10hrs+ per week.

The main issues highlighted include:

  • Accountability: "The obligation on the clinician to ensure AI tools are suitable, to continue to apply human oversight and judgement, and ensuring accuracy."

  • The need to understand the use of AI tools

  • Being transparent with clients about the use of AI tools, and providing clients with the information they need to make an informed choice about whether to consent to AI tools being used.

  • Informed consent. AHPRA suggests "If using an AI scribing tool that uses generative AI, this will generally require input of personal data and therefore require informed consent from your patient/client."

Standards will remain a topic of discussion for a long time. With legal help, we have suggested some of our own AI standards here: https://bit.ly/aistan.

So what do we get out of this?

  1. More legitimacy. Following on from the digital competency fact sheet, this mention is another win for digital legitimacy. The Psychology and AI space is clearly getting recognition, but psychologists are urged to be cautious.

  2. A lack of negativity

    • AHPRA did not discourage the use of AI, or explicitly challenge current use of AI by healthcare professionals, as long as adequate safeguards are in place.

  3. Australian Privacy Principles. There was no explicit mention of the need for AI tools holding personal information to be compliant with Australian Privacy Principles. However, this is an area where we urge extreme caution.

  4. Insurance

    • "Psychologists are encouraged to check with professional indemnity insurance providers to ‘ensure AI tools in practice are covered’."

  5. Governance

    • "Staff need to ensure they are using AI under the oversight of their employer."

Our recent legal event with Special Counsel Ashlee Provis highlighted the need for AI tools that handle personal data to comply with the Australian Privacy Principles, and the need to develop an AI policy and a privacy policy. You can still watch the event recording here: https://bit.ly/psychailegal

You can also watch or download my recent AI Basics and Getting Started event (run on 22 Aug) if you want to see some AI uses. The recording remains available for one month from purchase, and it can still be purchased until 26 September 2024. See: https://bit.ly/aibeginps

I look forward to providing further support to you and your organisation in understanding your AI requirements and planning the next steps in your implementation. We offer a consultancy service for individuals and organisations, as well as group training. Get in touch soon to book a time to discuss your or your organisation’s needs.

Email: david@psychologysquared.com.au.
