Mark Aitken, a registered nurse for 39 years who spent 16 years in aged care roles including assessing elderly people for support and funding, said he quit his job in regional Victoria just four months into using the tool.
[…]

"Eight times out of 10, the outcome was different to the one that I would have recommended, or my colleagues would have recommended," Aitken said.
It follows previous controversies over automated decision-making tools being used by the government, including the robodebt welfare scandal, and concerns about algorithm-driven disability funding through the NDIS.
The IAT user guide does not explain how the algorithm weighs risk, need or complexity, and Aitken said this information was never revealed to assessors.
When he asked at a government seminar about the evaluation framework, including what data was being collected, how accuracy would be assessed, and whether results would be publicly reported, he said he felt "shut down".
"I left my job because I didn't want to be part of a system that removed the ultimate decision-making about support from real, experienced people who care," he said.

"The government valued the algorithm more than people with skills, intelligence and knowledge."
He said some assessors began "gaming" the system, inputting information they knew would generate the level of care the person needed even if that information did not accurately reflect their situation.

"People shouldn't have to put in fake information," Aitken said. "I just started to feel like it was going to be another robodebt. I became very uncomfortable, and just felt the tool wasn't ethical."