Australia’s aged care sector has come under intense scrutiny in recent years.
A federal government inquiry established in 2018 to examine systemic failures across the industry concluded three years later, making 148 recommendations for fundamental reform.
On the tech front, healthcare organisations, including those providing aged care services, are grappling with fragmented digital infrastructure, escalating cyber risks, constrained budgets and a workforce stretched by administrative load, all of which slows progress on improving outcomes.
New technologies are being deployed as part of aged care reform, but just last month, concerns about the efficacy and humanity of a new automated assessment tool sparked controversy.
The Commonwealth Ombudsman said it was investigating an algorithm-based aged care assessment tool used to determine a recipient’s eligibility for in-home services.
Clinicians have been unable to override the algorithm, even when they believe an automated decision is incorrect.
Responsible innovation still has a place in aged care
Failures aside, innovations being tested by universities and their industry partners have the potential, if deployed and governed correctly, to significantly improve the care provided to older Australians.
Dr Alison Craswell, an Associate Professor at the University of the Sunshine Coast (USC), and her team are testing AI software that runs on cameras to monitor people with dementia and delirium in high-risk hospital settings. Initial testing is being conducted with the Sunshine Coast Hospital and Health Service.
The prototype algorithm detects increasing agitation in patients to help clinicians determine when intervention is required.
Initially, data was fed into the algorithm to detect low levels of agitation; over time, the model will be trained to recognise increasing levels of aggression and anger to improve accuracy, says Dr Craswell.
A camera or ‘sensor’, as Dr Craswell describes it, detects agitation based on specific behaviours and facial movements and then alerts clinicians. The software running on these cameras does not record or store data.
Experts were employed to rate levels of agitation on a human scale, and those ratings were then fed to the algorithm so it could learn indications of physical stress.
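As a rough illustration of that labelling step, several human ratings for the same video clip might be aggregated into a single training label. Everything below (the 0 to 5 scale, the median rule, the function and variable names) is a hypothetical sketch, not the USC team's actual pipeline:

```python
# Hypothetical sketch: turning expert agitation ratings into training labels.
# The scale, aggregation rule and names are illustrative assumptions only.
from statistics import median

def label_clip(expert_ratings, threshold=2):
    """Aggregate several expert ratings (0 = calm .. 5 = extreme agitation)
    into one training label by taking the median rating."""
    score = median(expert_ratings)
    return "agitated" if score >= threshold else "calm"

ratings_for_clip = [1, 2, 2]  # three raters score the same video clip
print(label_clip(ratings_for_clip))
```

Using the median rather than the mean is one common way to blunt the effect of a single outlying rater, though any real pipeline would make that choice with the clinical experts.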
“Along with a whole lot of coding that my computer scientist does with me, we’ve [built] a working computer interface that I can sit in front of and make faces at. It will tell me where my agitation sits in our model”, she says.
When training the model, the team mapped emotions from a ‘neutral’ baseline through to extreme agitation that could lead to workplace violence. These detections can then be shared with caregivers. The team also discovered that neutral emotions can be positive, an insight Craswell says is equally important with complex care needs.
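One simple way to picture that neutral-to-extreme mapping is a continuous agitation score that is thresholded into caregiver alerts. The thresholds, score range and labels below are illustrative assumptions, not the team's model:

```python
# Illustrative sketch only: mapping a continuous agitation score
# (0.0 = neutral baseline .. 1.0 = extreme agitation) to caregiver alerts.
# Thresholds and alert names are hypothetical.

def alert_level(score: float) -> str:
    if score < 0.2:
        return "neutral"  # patient at their best; a good time to engage
    if score < 0.6:
        return "watch"    # early agitation; a clinician may check in
    return "alert"        # escalating agitation; intervention advised

for s in (0.1, 0.4, 0.8):
    print(s, alert_level(s))
```

A scheme like this also captures the team's point about neutral states: a low score is not just the absence of an alert but a signal in its own right.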
“This is when they are at their best and this might be an appropriate time to further develop your relationship with them as a caregiver.”
Privacy and ethics will determine whether it can scale
A key issue, she says, is onboard processing (running the model on the camera itself rather than transmitting footage for external analysis), with patient privacy critical to the success of the solution.
“We’re not doing CCTV, we’re not recording but to do onboard processing we have to minimise the model size, so these are the limitations that we’re working within. I think that with technology advancements and the speed by which change happens, that model size won’t be as critical as we think and we’ve got some ideas on how to manage it now.
“It might turn out that our model’s a bit clunky at the start but can be refined in the rollout. The health productivity lab at USC, where we’re testing the system, has been fantastic as a safe space to do the development that needs to happen before we can get ethics approval to do anything in the health service space”, she says.
Interest across the hospital network
Healthcare providers across the network including Sunshine Coast University Hospital, Nambour General Hospital, Gympie Hospital, Caloundra Health Service and Maleny Soldiers Memorial Hospital are interested in the technology, says Dr Craswell.
These facilities, which also run outpatient clinics for oral health, minor illnesses and injury, see potential applications in monitoring patients experiencing pain, anxiety and heightened alertness.
Workplace violence can also be an issue in these settings, she says.
“There are a lot of areas where they [hospitals] are looking at a machine vision solution adapted to their clinical needs and they’re really interested in using it to help clinicians work at the top of their scope.”
“If clinicians are spending a whole lot of time watching waiting rooms, or monitoring people for changes, they’re not doing the job they’ve been trained to do.”
“They see [it] as ambient identification [tech] similar to what a digital scribe does, sitting in the background and giving you a tap on the shoulder [saying], ‘Hey, you might not have noticed but this person is getting agitated.’ The co-design of how that alerting system [will operate] is yet to come in the proof of concept.”
Dr Craswell expects a working prototype to be live in a clinical setting by the end of the year.
She says the delay is due to ethical considerations.
The informed consent process is detailed and scrutinised by ethics committees, with patients and their families being well informed about how the technology is being used.
“We’ve got a couple of levels of this project [that needs to go] through the ethics process and we need to increase the amount of actual patient vision that we use for training to refine the model.
“[This is] the balancing act we’re [undergoing] at the moment with the ethics department but we’re hopeful that’ll all be [done] by the end of the year.”