Desafíos actuales de la Inteligencia Artificial
The Government nevertheless insisted on using averaging alone to claw back social security payments from a vast number of people. Since robodebt was first established in 2015, the Government has raised debts totalling over A$1.7 billion against 430,000 people – mostly aged pensioners and disability support pensioners, many of whom subsequently required financial and mental health counselling. In September 2020, a parliamentary committee found that robodebt had had "an overwhelming and devastating impact on many people's emotional and financial wellbeing and willingness to engage with and trust government services" (Maxwell, 2021, p. 8). The situation was so alarming, and its consequences so serious, that in 2020 a committee of people unfairly harmed by the system was set up, along with a website where they could report the devastating personal consequences and band together to press for the money to be returned, which it eventually was. In the end, the Federal Court of Australia declared the robodebt system illegal, as the government itself conceded, and the Government agreed to refund the amounts wrongly charged (Maxwell, 2021).

A very similar situation occurred in the UK, where more than 200,000 people were wrongly charged in relation to housing benefit. The automated system flagged numerous people as fraudulent, but was later shown to be flawed and biased against vulnerable claimants (Maxwell, 2021). On that matter, Eubanks explores the impact of automated algorithms in social policy and includes real-life accounts of people who have suffered digital social exclusion. One of the stories involves a young woman who received medical benefits, food stamps, public transport and other services from the US government (Eubanks, 2018).

3. CONCLUSIONS

The appeal of AI systems for social protection lies in their potential to operate at scale, as the Swedish experience shows: speeding up eligibility checks allows more assessments to be carried out in a short period of time and reduces costs. However, this approach has also proved controversial, as the examples from different countries illustrate.

The case of Australia's Robodebt system highlights the dangers of increasingly automating social protection through AI. The system relied on averaged income as a measure of eligibility, which proved to be an inaccurate metric. Robodebt led to serious financial and emotional consequences, with more than A$1.7 billion in debts raised against 430,000 people since its introduction in 2015. The 2020 parliamentary inquiry found that, for a large number of respondents, it had systematically damaged their health and their trust in government services. Ultimately, Robodebt was declared unlawful by the Federal Court of Australia, and all money wrongly taken from individuals had to be repaid, with interest. The case illustrates a fundamental problem with the data on which such systems rely, and shows how AI-powered systems can cause real harm when they are built on dubious assumptions, especially where marginalised people are concerned.

In the UK, the Department for Work and Pensions has come under fire for its benefit fraud detection algorithm, which was found to unfairly target disabled people and certain