This article presents a detailed case study of “RoboDebt” in Australia and examines the political rationalities that underpin automated welfare surveillance systems. First, it is argued that neoliberal political rationalities shape the bureaucratic strategies enacted by agencies established to administer neoliberal welfare policy. Second, it is shown that neoliberal political rationalities influence the design and deployment of new surveillance technologies, and therefore they are embedded with and within politics too. Third, it is argued that the political architecture of the welfare state and associated use of information communication technologies has consequences for social justice. This article demonstrates that information communication technologies are implicated in the neoliberal governance of poverty but are not responsible for it. The article concludes by reflecting on the relevance of critical qualitative inquiry as one possible political intervention to advance social (and data) justice agendas.


This article presents a detailed case study of the “Better Management of the Social Welfare System,”1 a form of automated welfare surveillance in Australia. It examines the political rationalities that underpin the system as a technology of governance with consideration of the nontechnological dimensions and sociopolitical contexts. That is, it decenters technology with respect to its “subordinate place . . . in a larger political problem: the architecture of the welfare state” (Peña Gangadharan and Niklas 2019). This is done through the frames of governmentality (Foucault 2009; Miller and Rose 1990; Rose and Miller 1992; Henman 2006) and technological politics (Winner 1980). Examination of the politics of automated welfare surveillance systems demonstrates how information communication technologies (ICTs) are implicated in neoliberal governance of poverty but are not responsible for it. It underlines the importance of the politics of data systems and highlights the value of using governmentality as a conceptual framework for analyzing government uses of data and information systems in neoliberal contexts. The argument unfolds across four steps: First, it is argued that neoliberal welfare policies shape the bureaucratic strategies enacted by agencies that are established to administer them. Second, neoliberal political rationalities also influence the design and deployment of new surveillance technologies, and therefore they are embedded with and within politics too (Winner 1980). Third, it is argued that the political architecture of the welfare state and the associated use of ICT has consequences for social justice. Finally, the article concludes by reflecting on the relevance of critical qualitative inquiry as one political intervention to advance social justice agendas (Denzin 2017, 2018; Flick 2017).

Conceptual framing: Power, politics, and decentering technology

The critical frame of governmentality is often used to explain how surveillance is deployed as a powerful technique to manage populations, and involves analysis of the relationships between the political rationalities of governance and technologies of governance. Following Miller and Rose (1990) and Rose and Miller (1992), Henman (2006, pp. 24–25) draws a distinction between “‘political rationalities’ [which] are the discursive, constructive and justificatory elements of government, whereas ‘technologies of government’ are the means with which such discourses are translated into action or enacted.” Also inspired by Foucauldian conceptualizations of power (as per Foucault 2009), Bucher (2018, p. 37) argues that “algorithms do not simply have power in the possessive sense; they constitute ‘technologies of governance.’”

A complementary lens can be found in Winner’s (1980) influential theory of technological politics. Winner (1980, p. 122) argues that “technical systems . . . are deeply interwoven in the conditions of modern politics.” Winner illustrates this argument with the example of New York bridges whose heights were designed to prevent individuals of low socioeconomic status, who tended to rely on public transport, from accessing public spaces, as buses could not travel under the bridges. The “technology” highlights class conflict and how politics are imbued within the “monumental structures of concrete and steel [that] embody a systematic social inequality” (Winner 1980, 125). Winner’s (1980, p. 121) approach means examining technologies “not only for their contributions of efficiency and productivity . . . but also for the ways in which they can embody specific forms of power and authority” (emphasis added). This approach to understanding technological politics can be related to Genosko and Thompson’s (2002) analysis of “pre-electronic technologies” of administrative surveillance to restrict alcohol consumption in Canada. They argue that “beyond technology is, then, the power that accrues to those and their cohorts who use categorisation of such personal information for varied and politically motivated purposes of social control” (Genosko and Thompson 2002, 2, emphases added).

Despite the central role that digital technologies now play in social security, a focus on what “databasing” and automation allow may overshadow the nontechnological dimensions of welfare surveillance. This concern can be related to recent calls made by Peña Gangadharan and Niklas (2019) to decenter technology in discourse on discrimination and inequality, which are socially entrenched and independent of technology. They argue that “while automated systems factor into this architecture . . . technology is a secondary concern to the larger problem that prompted automated welfare’s implementation (i.e. austerity)” (Peña Gangadharan and Niklas 2019, 892). Along these lines, Henman (2017, p. 1) calls for “a critical analysis of algorithms that interrogates both the socio-technical design and development of algorithms, as well as the socio-organisational location of their operation.” This type of analysis requires a focus on the way surveillance practices “exert influence and reproduce power relations through technological and non-technological means alike” (Monahan 2017, 192). For example, Eubanks contrasts different dimensions of welfare surveillance, arguing that:

Conceptually, computerised information systems in wide use by departments of social services are not very different from invasive home visits by caseworkers, extensive (though narrowly focused) case records . . . Politically, the purposes of surveying the poor have largely stayed constant for three centuries: containment of alleged social contagion, evaluation of moral suitability for inclusion in public life and its benefits, and suppression of working people’s resistance and collective power. (Eubanks 2006, 90, emphasis in original) 

Monahan (2017, p. 201) argues surveillance is a way to regulate “conditions of abjection” “at the margins” of society, and that these conditions “are materialised by neoliberal dynamics . . . and imperatives for data collection and technological automation.” Surveillance, as well as any critique of it, cannot be separated from the contexts in which it operates. Similar arguments have been made by Henman (2017, p. 1) in advocating for the recognition of the “social embeddedness” of algorithms. Significantly, Henman (2017, p. 5) asserts that this type of “analysis needs to consider the purposes for the algorithm, and reasons it is created and what its use is seeking to achieve beyond its technical performance” (emphasis added). Henman (2017, p. 1) makes a case for analyzing “performance with respect to its purpose” and “role in politics” (emphasis added). What appear to be “rational and ideologically ‘neutral’ systems of surveillance” enact political ideologies (Dee 2013, 282–83, emphasis added).

Aims and method

This research aimed to examine the political rationalities supporting programs of automated welfare surveillance, and employed a qualitative case study design to do so. It has been argued that RoboDebt was a “politically-driven process” (Henman 2017, 1, emphasis added) and therefore was an ideal case to address the research aims. It does so via examination of the official aims of the program (i.e., the discursive, constructive, and justificatory elements of government) and consideration of its design and operation (i.e., the ways discourses are enacted as “technologies of governance”) within a local sociopolitical context. This research is concerned with advancing social (and data) justice through critical qualitative inquiry. Critical qualitative inquiry is an approach to conducting research in “the pursuit of social justice within a transformative paradigm [that] challenges prevailing forms of inequality, poverty, human oppression, and injustice” (Denzin 2017, 8). Flick (2017) outlines an approach to critical qualitative inquiry that commences with the identification of vulnerable groups and the social problems confronting them, proceeds to analysis of the way institutions deal with such problems, and concludes by reflecting upon the usefulness and relevance of the research in addressing them. Within the field of critical inquiry, the researcher is an advocate “challenged to confront the facts of injustice, to make the injustices of history visible, and hence open to change and transformation” (Denzin 2018, 97). This intervention consists of examination of the justificatory elements of the program and its operation as a “technology of governance” in order to better understand, and possibly open a space to challenge, the rationalities and politics supporting automated systems of social control.

The research drew empirical data from policy documents and related documents of the program, selected on the basis of their relevance to the case study and research aims. These documents were drafted from official perspectives (i.e., Senate committee, Senate estimates, government responses) and therefore represent a certain worldview—that is, official and political discourse concerning the program. These data included reports of various inquiries into the program, such as the Senate Community Affairs References Committee (2017) inquiry into “the design, scope, cost-benefit analysis, contracts awarded, and implementation of the system.” Government budget and Senate estimates documents were consulted, including the transcripts of the Senate estimates hearings and the government’s responses to questions taken on notice. These data provided information about the fiscal context and stated rationale for the program and were contextualized with reference to government or ministerial announcements and media statements. Media reporting around key aspects of the program was accessed online, particularly when unauthorized disclosures were made about certain aspects of the program, because these perspectives could not be obtained from official sources. These reports, such as the statements made by anonymous whistleblowers, represented the perspective of those working within the system and were drawn upon to present a more complete (yet always partial) description of the various aspects of the program. Relevant statements were incorporated (for example, bureaucratic strategies of staff performance management) to provide a more complete depiction of the program and supplement the official policy documents. All data explicitly referred to in the case study are cited within the text so that the reader may trace and assess the original source of information.
The data were analyzed through discourse analysis to examine the political rationalities underpinning the design and operation of the program and to present a comprehensive and critical assessment of it. This process involved analyzing the official discourse concerning the program and the stated rationale supporting its introduction and operation, within social and political context (on discourse analysis, see Johnstone 2018). These discourses include the rationalities of RoboDebt, the fiscal context and the justificatory elements supporting the program’s introduction, and the bureaucratic strategies Services Australia deployed to expand and defend the program as a way of regulating “neoliberal deviants.”

The “RoboDebt” program

In July 2016, under a conservative Liberal government, the then Department of Human Services (DHS, now Services Australia) launched an automated debt raising program to identify “overpayment” of welfare benefits. If the algorithm2 detected discrepancies between income reported to Centrelink3 and that registered by the Australian Taxation Office (ATO), a debt recovery letter was sent requiring the welfare recipient to prove that they did not have a debt, with unconfirmed or unpaid debts subject to recovery processes such as withholding social security payments, garnishing tax returns, or outsourcing to debt collectors (Knaus 2017; Commonwealth Ombudsman 2017; Order of Justice Davies in Deanna Amato -v- The Commonwealth of Australia 2019). According to Centrelink (n.d.), the “program helps protect Australia’s welfare system” and is designed to “help you to avoid owing money” (emphasis added).4 Figures from April 2019 reveal that between July 1, 2016, and March 31, 2019, just over half a million debts were raised via the automated system (“Senate Community Affairs Legislation Committee. Estimates” 2019, 127).5

To date, two inquiries have been held into the program: one referred by the Senate Community Affairs References Committee (2017) and one initiated by an own-motion investigation by the Commonwealth Ombudsman (2017) in response to complaints received. The Senate inquiry recommended suspension of the system until issues of procedural fairness were addressed (Community Affairs References Committee 2017). This was, in part, because the program engaged in the mismatching of inaccurate data, resulting in erroneous debts: it was estimated that 20 to 40 percent of the debts were false (Community Affairs References Committee 2017). The erroneous debts caused significant “data harm” including vulnerable and already financially disadvantaged individuals being issued with debts that they did not owe; the debts being outsourced to private debt collectors, with a 10 percent debt recovery fee added to the debt; and the garnishing of welfare payments and tax returns to recoup the debts (on data harm, see Redden and Brand 2017; for stories documenting the individual impacts of RoboDebt, see #NotMyDebt;6 see also Community Affairs References Committee 2017 for further details of the individual impacts and harms of the program). The government response to the inquiry stated that “with $170 billion of welfare payments made in 2015–16 the Australian Government have [sic] a responsibility to make sure people are paid the right amount” and that “each person should receive exactly what they are entitled to, no more and no less” (Australian Government 2017, 5, emphasis added). It is telling that the program is explicitly designed to identify overpayments (“no more”), rather than identifying underpayments (“no less”). Ultimately, the government rejected the recommendations of the inquiry, arguing that there is “no evidence to support the recommendation to put on hold the online system” (Australian Government 2017, 5, 8).
Two years later, in July 2019, the program was again referred to a further Senate Community Affairs References Committee Inquiry, with a report pending (Siewert 2019).

In late 2019, the government conceded in court that the automated approach to debt recovery was illegal. The Federal Court of Australia declared that the automated debt calculated in the case in question was based on erroneous assumptions about averaged fortnightly income, that there was no material to support the assumptions made by the algorithm, and that there was probative material (i.e., the fortnightly earnings reported) contrary to the averaged amount calculated by the algorithm. Debts identified via the income averaging and data matching algorithm will have to be reassessed, with the government to repay “debts” it recovered. However, it is perhaps too early to ascertain the exact consequences of this decision for welfare surveillance in Australia. It may result in enhanced data collection and matching, such as through expanded financial monitoring, announced shortly after RoboDebt was found to be illegal as a new initiative to deliver AUD 2.1 billion in “savings” over the next four years (Brown 2020). In light of the above, this article presents a detailed case study of the political rationalities that underpinned RoboDebt as a program of welfare surveillance in Australia.
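The income-averaging logic at issue in the court's finding can be made concrete with a short sketch. This is an illustrative reconstruction, not the actual Services Australia system: the function names, figures, and single-threshold comparison are hypothetical simplifications, and the real data matching involved further steps. The sketch shows only why smearing an annual Australian Taxation Office income figure evenly across fortnights manufactures "discrepancies" for people with irregular earnings.

```python
# Illustrative reconstruction (NOT the actual Services Australia system):
# averaging an annual ATO income figure evenly across fortnights can
# manufacture a "discrepancy" for a recipient with irregular earnings.

FORTNIGHTS_PER_YEAR = 26

def averaged_fortnightly_income(annual_ato_income: float) -> float:
    """Smear a yearly income total evenly across 26 fortnights."""
    return annual_ato_income / FORTNIGHTS_PER_YEAR

def flags_discrepancy(reported_fortnightly: list[float],
                      annual_ato_income: float) -> bool:
    """Flag a potential 'debt' whenever the averaged fortnightly figure
    exceeds what the recipient actually reported for any fortnight."""
    avg = averaged_fortnightly_income(annual_ato_income)
    return any(avg > reported for reported in reported_fortnightly)

# A casual worker earns AUD 12,000 across six intensive fortnights and
# truthfully reports zero income for the other twenty while on benefits.
reported = [2_000.0] * 6 + [0.0] * 20
annual = sum(reported)  # 12,000.0 -- exactly matches the ATO record

# Averaging yields ~461.54 per fortnight, so every truthfully reported
# zero-income fortnight looks like underreporting: a false debt is flagged
# even though the annual totals agree perfectly.
print(flags_discrepancy(reported, annual))  # True
```

The point the sketch illustrates is the court's: the averaged figure is an assumption, not evidence, and the probative material (the fortnightly reports themselves) can contradict it entirely.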

Neoliberal Welfare Policy and the Rationalities of RoboDebt

The “worldwide spread of capitalist economic relations” has partly contributed to a remarkable neoliberal convergence in the modes of public administration in “Western” countries over the past two to three decades (see, e.g., Bell 1997, 346; Khoury 2015; Spies-Butcher 2014; Harvey 2007). Neoliberalism is an “ideology and a set of practices” concerned with rationality, efficiency, and productivity, in which the public sector is run like a business, preferencing, above all, the free market (Khoury 2015, 25). The main neoliberal practices involve deregulation, marketization, and privatization (Khoury 2015). The consequences of neoliberal economic rationalism often involve cost cutting in services and their delivery under programs of austerity, which operate to “reduce government spending while simultaneously facilitating government withdrawal from key responsibilities” (Khoury 2015, 26, 28). There is an emphasis on individual responsibility, which plays out perhaps most prominently in welfare: individuals are held responsible for their own prosperity and, conversely, their own poverty. Aligned with neoliberalism, a reordering of welfare policies and practices has involved three main interrelated approaches: (1) the reduction of welfare rates and harsher eligibility requirements, (2) increased surveillance of welfare recipients, and (3) connecting welfare to work (Maki 2011). There is a “focus on bureaucratic, measurable, rational-technocratic procedures and interventions” to manage populations (Dee 2013, 272).

Australia has embraced neoliberal ideals across government, especially in welfare (Henman and Marston 2008). Neoliberal policy and practices have been extensively implemented in Australia, increasing social and economic inequality and resulting in “the demise of social justice” (Khoury 2015, 25). Australia has deployed numerous digital administration projects, many of which have been subject to intense criticism (see, e.g., Galloway 2017). This has occurred in concert with, or perhaps because of, outsourcing (see, e.g., Pearce 2017). Consistent with neoliberal trends toward bureaucratic centralization, the administration of welfare in Australia occurs via a “mega department” known as Services Australia, formerly the Department of Human Services (DHS) (Halligan 2015, 1002). Services Australia, as a “liberal bureaucracy” (Giauque 2003), evolved following mergers of social service agencies to increase efficiency via integrated ICT systems7 (Halligan 2015).

In Australia, welfare policy is highly politicized, and presently there is a public campaign to “raise the rate” of welfare payments, which have not increased in real terms for more than twenty-five years (Raise the Rate 2019).8 The current rate of welfare payments is well below the poverty line,9 yet calls to “raise the rate” by a coalition of social and community organizations have been emphatically rejected by the government (Murphy 2019). Rather, they have been met with proposals made by the government for enhanced technological and biopolitical surveillance, including income management via a cashless welfare card (Elton 2019) and mandatory drug testing (Henriques-Gomes 2019c; Martin 2019) as “solutions” to poverty. Further, since 2015, in combination with the Australian Federal Police, the DHS has operated “Taskforce Integrity,” where “data analysis, and other information, points to a higher risk of non-compliance and suspected welfare fraud” (Australian Federal Police n.d.). The RoboDebt program, as introduced above, has been a key component of the Australian government’s welfare policy to reduce welfare expenditure through detecting illegitimate entitlement and fraud by welfare recipients. The elements presented and discussed below, namely the fiscal context, the justificatory elements supporting the program’s introduction, and the bureaucratic strategies Services Australia deployed to expand and defend the program as a way of regulating “neoliberal deviants,” emerged from the analysis of the official discourses.

Fiscal context: Justificatory elements supporting RoboDebt’s introduction

In order to identify the political rationales supporting RoboDebt, it is necessary to consider the fiscal context in which it was implemented, as the fiscal context largely provided the “justificatory elements” supporting its introduction (Henman 2006, 24–25). Achieving a budget surplus has been a central issue for successive Liberal governments and was a key priority when the program was initiated in 2016.10 When RoboDebt was first implemented in 2016, the government forecast that it would save AUD 1.7 billion over five years. A year later, the government indicated that it would achieve AUD 3.7 billion in savings over four years, and in 2019 it was estimated that from 2019 to 2023 the government would achieve a further AUD 4 billion in “savings” (Henriques-Gomes 2019c).11

It is worth considering these forecasts in the context of the amount of actual “savings” that were recovered (and will likely have to be repaid given the illegality of RoboDebt). In a Senate estimates hearing in April 2019, the government stated that it had raised approximately AUD 1.5 billion and that this figure was increasing at a rate of approximately AUD 10–12 million per week, equating to approximately six thousand debts “finalised” each week (“Senate Community Affairs Legislation Committee. Estimates” 2019, 129).12 As of December 31, 2018, the government had recouped approximately one-third of this amount, equating to approximately half a billion dollars.13 There were questions about whether the government would achieve its forecasted “savings” (Commonwealth Ombudsman 2017). Moreover, in June 2018 it was reported that the DHS had spent AUD 375 million on the program (Barbaschow 2019), a figure reported in February 2019 to have increased to AUD 400 million by the end of 2018 (Henriques-Gomes 2019a).14 Since the RoboDebt program was found to be illegal, debts identified via income averaging will have to be reassessed and repaid. It is not clear exactly how much the government will have to repay; it is expected to be in the hundreds of millions (Karp 2019a). This outcome casts serious doubt on whether the automated program will meet savings projections.

Nevertheless, government representatives consistently argue that the program has worked very well. For example, the then minister for social services, Christian Porter, has stated that “over the course of the next four years, it’s going to recoup $4 billion (AUD) of taxpayer’s money . . . The question is not whether or not the system is working—it absolutely is working” (cited in Barbaschow 2017). This statement can be contrasted with views that it was “extraordinary and disturbing that the department could describe the project as having gone ‘very well’ despite the well documented hardship and distress it caused countless Australians” (Senate Finance and Public Administration Committee 2018, 106). Even when the program was found to be illegal, Minister Stuart Robert argued that only a “small cohort” of debts would be impacted, that there is “no change to the construct of the onus of proof,” and that “we use other proof points as well” (cited in Karp 2019b). The government has doubled down on the argument that the program is a success and will result in savings via reduced welfare expenditure. When questions are raised about the ability of the program to meet fiscal objectives, or indeed even its legality, these are not met with calls to abolish the system but with strategies to expand it. These strategies enact neoliberal political rationalities and operate as “technologies of government” via bureaucratic plans, performance management techniques, and discursive tactics to defend the program.

Bureaucratic strategies to expand and defend RoboDebt

Documents leaked to The Guardian in 2019 (Henriques-Gomes 2019b) show that proposals to expand the program15 were designed to target (more) vulnerable groups. The documents show Services Australia’s observation that the “estimated savings over the forward estimates cannot be achieved without undertaking sensitive cohort reviews” and that “the department would need to carry out an additional 1.6m income reviews over the next three years to reach the promised savings, including 350,000 debt-recovery reviews among ‘sensitive’ or vulnerable groups.” This would have, in effect, captured an additional 240,000 pensioners, 40,000 people in remote areas (50 percent of whom are Indigenous),16 and 70,000 individuals considered vulnerable (e.g., those who have a known vulnerability indicator such as domestic violence survivors or those with a disability or a mental disorder).

Performance management attempts to increase staff efficiency in recovering debts were also leaked to the media. Former Centrelink compliance officers turned whistleblowers released information concerning “debt finalisation” targets (Smee 2019). These anonymous whistleblowers have stated (as cited in Kearsley and McPherson 2019) that the instructions they received “were to do everything possible to accelerate the process in debt raising and satisfy the targets that were being set,” and in the event that targets were not met, “you were put on performance review for four weeks, and then escorted out of the building if you didn’t meet your finalisation requirements.” One former compliance officer reflected that “it was a horrible job. The department is just a debt-raising machine, and that’s all they care about,” while another commented that “it was very inhumane. It was all about the money, and we have to get those finalisations.” The political rationale supporting such approaches to performance management is clear, as one anonymous compliance officer stated:

It was all about the numbers. They would constantly say we are trying to adhere to the estimates that were provided to the Senate estimates hearing in relation to how many finalisations would be completed within a given period for the sake of trying to recoup revenue.

These comments speak to a deeper reflection on the experiences of humans working within the mostly but not entirely automated system, and on their role in raising revenue (rather than administering “welfare”). Systems of surveillance “change the character of occupational knowledge” (Genosko and Thompson 2002, 26) and regulate those administering the system, as well as those it is targeted against. Performance management is a “technology of governance” as it “define[s] circuits of power that transform and link the governance of welfare agencies, managers, workers and clients” (Henman 2006, 29). These observations can also be related to research on ICT and social security that demonstrates how ICTs are used to control rather than empower those who work in welfare in addition to welfare recipients (Henman and Adler 2003).

There have been successive challenges to RoboDebt, and in responding to them, Services Australia revealed its approach to defending the system. One challenge was made by civil society activist Justin Warren17 to have Services Australia release information about RoboDebt (including but not limited to documents concerning risk governance and oversight of the data matching) under the Freedom of Information Act (1982) (Cth). In response to requests first made in early 2017, Services Australia refused to release documents, claiming that publication may pose security risks (Right to Know n.d.), or as one media article stated: “Human Services claimed people wouldn’t pay debts if informed about its IT systems” (Stilgherrian 2019). In June 2019 the Australian Information Commissioner ruled that Services Australia must release these documents (Australian Information Commissioner 2019), and Services Australia appealed this decision to the Administrative Appeals Tribunal. The department and the government have consistently withheld information from the public. For example, the Senate ordered the government to produce the legal advice that informed its concession that the program was illegal, and the responsible minister refused to do so (Ireland 2019). In the midyear budget update, the government did not provide information on the number or corresponding amount of debts that will need to be reviewed and repaid.

Automated surveillance to regulate “neoliberal deviants”

The political rationalities of neoliberalism, specifically reducing spending on “neoliberal deviants” (Maki 2011), influence the design and deployment of new surveillance techniques that are therefore embedded with and within politics too. The political rationalities of such inherently political technologies are “almost invariably linked to specific ways of organising power and authority” that reveal how technology contributes to a broader “pattern imposed independently by a governing body, ruling class, or some other social or cultural institution to further its own purposes” (Winner 1980, 131). Indeed, “the politics of surveillance necessarily include the dynamics of power and domination” (Gilliom 2001, 2–3). Surveillance is not applied equally across the population, and some cohorts or groups are surveilled more, and in different ways, than others (Henman and Marston 2008; see generally Gilliom 2001). Maki (2011, p. 52) examines shifts in surveillance within neoliberal contexts, specifically in Canada, arguing that “welfare recipients, as neoliberal deviants, are among the most highly surveilled and regulated in Western societies.” Surveillance operates to manage “neoliberal deviants” so they can transition into productive workers and consumers of a capitalist market society. Welfare recipients are doubly deviant because they draw resources from the state while not contributing to it (i.e., as taxpayers or as consumers). Intense and invasive welfare surveillance is justified on the basis that welfare recipients are “fraudulent and untrustworthy” (Maki 2011, p. 54). Similarly, Monahan (2017, p. 201) notes that surveillance in welfare is fused “with a culture of suspicion of welfare fraud.” In Australia, Wilcock (2014, p. 190) demonstrates that the construction of a “welfare cheat” is “an image that conforms neatly to the pervasive trope of ‘personal responsibility’ for both crime and poverty” within neoliberalism.
This discursive imagery provides the conditions necessary for the deployment of new ICTs and techniques for surveillance to control “neoliberal deviants.”

There is a mutually reinforcing relationship between neoliberal ideologies and the development of ICT and “databasing” to achieve them. ICTs are assumed to be a sine qua non for neoliberal ideals of efficiency, productivity, rationality, and risk-based managerialism, and neoliberal governance invariably extends the development and application of ICT toward these ends (see, e.g., Henman and Adler 2003). ICTs are inherently political (Winner 1980), and their political nature is most striking in welfare surveillance. Here, Winner’s arguments concerning the politics of artifacts (i.e., his famous bridge example, as introduced above) can be explicitly connected to Yeung’s (2018, p. 507) taxonomy of algorithmic regulation and the use of algorithms by government in an attempt to “manage risk or alter behaviour to achieve some pre-specified goal.” Technology, specifically automation and databases, has significantly “expanded the state’s power to regulate, monitor and control poor people” (Maki 2011, 60) and “simply represents the latest episode in the ongoing relationships of power and domination along the well-established lines of race, class and gender” (Henman and Marston 2008, 199). Databases have become the main way to weaponize personal information (Eubanks 2006, 2018). In Australia, RoboDebt is an exemplar of how ICTs under neoliberalism can be used against already marginalized populations to further control and marginalize them.

RoboDebt used an algorithm to detect “overpayment” of social security benefits, identifying unscrupulous “welfare cheats” in order to recoup welfare spending toward a budget surplus. It is not the algorithm itself that is of central concern but rather the way it was designed and operated: to achieve savings on social security that (supposedly) contributed toward a budget surplus, and to induce self-regulation in reporting by welfare recipients, who would be caught if they attempted to deceive the system. The system, by design, operated to identify overpayments rather than underpayments, despite the government’s own rhetoric of legitimate entitlement. “Savings” occurred through an ex post facto recovery process rather than an a priori accuracy measure. The system was designed to punitively target a certain part of the population with debts (framed as “savings”) rather than to ensure that payments are accurate or to attempt to alleviate poverty.18

Social (and Data) Justice: Intervention through Critical Qualitative Inquiry

The political architecture of the welfare system under neoliberalism, and the use of ICTs to support it, has significant consequences for social and data justice. RoboDebt forces us to question what it means for the system to be “fair,” given the rationalities that underpin it and that gave rise to its design and deployment. The system aims to recoup welfare spending through automated surveillance that is irrefutably unfair because it identifies debts that do not exist and that the government is not lawfully entitled to claim. This is “unjust enrichment” and will form the basis of a forthcoming class action (Carney 2019). RoboDebt also attempted to recoup debts by reversing the onus of proof, so that individuals had to prove they did not have a debt, which is procedurally unfair too. However, there is a deeper notion of social justice that relates to equity and equality, especially in terms of need. Social justice considerations are significant given that the vast majority (90 percent) of Newstart recipients in Australia live below the poverty line (Davidson et al. 2018). RoboDebt was targeted at vulnerable and marginalized members of the community, and the department proposed expanding the program to target those who are even more vulnerable (i.e., specifically those with known vulnerability indicators) to achieve targets. This points toward class inequality and power structures, facilitated by and through technology, in concert with and within the scaling back of the welfare state. RoboDebt was designed to target vulnerable and already marginalized people with (false) debts rather than to alleviate or address poverty and inequality in society. 
RoboDebt therefore requires us to be attentive to the ways in which automated surveillance technologies contribute to and reinforce a social structuring of powerful and powerless classes, in which individuals of low socioeconomic status are subjected to automated surveillance as part of a conservative neoliberal politics that aims to deny, or at least reduce, welfare payments as much as possible, and to as many people as possible.

This recognition opens up a space to interrogate the “choices that can affect the relative distribution of power,” leading to what Winner (1980, 127) argues is the “crucial decision” of whether “to develop and adopt the thing or not” (on this point, see also Powles and Nissenbaum 2018). That is, whether the automated technology should be abandoned altogether, whether the system can be improved, or whether a balance can be struck between “fairness” and automated welfare surveillance. It could be argued that algorithms can be made “fairer” by opening up their “black boxes” and inspecting how they create errors or unduly discriminate, with a view to tinkering and optimizing. Yet this amounts to “algorithmic fetishism” (Monahan 2018), and “better-designed welfare automation (e.g., fair algorithms) may not save a shrinking welfare state or be the best instrument for transforming cultural understanding of the poor” (Peña Gangadharan and Niklas 2019, 896). Decentering technology shows that “algorithmic fairness” can never be achieved in a fundamentally unfair and unequal society. Technology does not, and will not, solve social problems like poverty, which is entrenched via a neoliberal “welfare” system that is arguably meant to alleviate it. Under neoliberalism, welfare is cast as antithetical to economic progress, a legacy of the “welfare state” whose payments conservative governments now limit, restrict, and recoup, activities made all the more possible by automated technologies. Technology facilitates and enables the punitive, unfair, and unjust policies of the neoliberal state to be executed at great(er) scale and speed. In this way, technology embedded with, and within, neoliberal politics is the antithesis of social justice. Yet it is not technology that is the real menace; rather, it is the neoliberal welfare state itself.

One avenue toward challenging the neoliberal welfare state is critical qualitative inquiry and academic research as advocacy that aims to expose and respond to “inequality, poverty, human oppression, and injustice” (Denzin 2017, 8). Academic research as advocacy is one possible intervention for understanding the ways powerful institutions deal with vulnerable groups, and the strategies deployed to respond to them, which in turn operate as “technologies of government.” In the present case, this involved examining the justificatory elements supporting the RoboDebt program and its operation as a “technology of governance” in order to expose, and possibly open a space to challenge, the rationalities and politics supporting it as a system of automated social control. This approach shares an affinity with the DATACTIVE movement, which engages in research “at the intersection of ‘traditional’ research and the set of critical and/or activist practices that deal or ‘act upon’ datafication” and seeks to interrogate the politics of data (see Kazansky et al. 2019, 245). It also connects to the data justice movement, which examines “the role of data collection in (new) forms of governance that shape society in line with particular political and economic agendas” (Dencik, Hintz, and Cable 2019, 180) and “situates concerns with data-driven processes within a broader framework of social justice” (Dencik, Hintz, and Cable 2019, 182). Critical qualitative inquiry aligned with social (and data) justice is one possible intervention for analyzing, understanding, and challenging the neoliberal welfare state and the role of technology within it. In doing so, such research renders “the injustices of history visible, and hence open to change and transformation” (Denzin 2018, 97; Flick 2017). 
It enables an opportunity to “scrutinise the interests and power relations at play in ‘datafied’ societies that enfranchise some and disenfranchise others, highlighting also forms of exclusion and discrimination” (Dencik, Hintz, and Cable 2019, 181).


This article has examined the neoliberal political rationalities that support welfare surveillance systems. Neoliberal political rationalities underpin both the bureaucratic strategies that are enacted as “technologies of governance” and the adoption and deployment of new surveillance technologies, which are embedded with and within politics too. The RoboDebt case study opens up an opportunity to attend to the ways new technologies reinforce a social structuring of powerful and powerless classes, in which individuals of low socioeconomic status are subject to automated surveillance as part of a neoliberal politics that aims to deny, or at least reduce, welfare payments as much as possible, and to as many people as possible. The political architecture of the welfare state and the associated use of ICT have consequences for social justice, as inequality is ingrained in a neoliberal “welfare” system that is arguably meant to alleviate it. Therefore, to advance social justice agendas, the architecture of the neoliberal welfare state needs to be dismantled, and one possible avenue for political intervention is critical qualitative inquiry. Through such methods, this case study has examined the political dimensions of welfare surveillance, demonstrating how ICTs are implicated in the neoliberal governance of poverty but are not responsible for it.


The author would like to thank Dr. Angela Daly, Dr. Ian Warren, Mr. Angus Murray, Ms. Lyndsey Jackson (of #NotMyDebt), and the two anonymous reviewers for comments on previous iterations of this work. She would like to acknowledge the excellent research assistance provided by Dr. Kayleigh Murphy, and QUT for financing Dr. Murphy’s contribution under the author’s Vice-Chancellor’s Research Fellowship.

Author Biography

Dr. Monique Mann is a Senior Lecturer in Criminology and a member of the Alfred Deakin Institute for Citizenship and Globalisation at Deakin University. Dr. Mann is an Adjunct Researcher with the Law, Science, Technology and Society Research Centre at Vrije Universiteit Brussel. Mann's research expertise concerns three main interrelated lines of inquiry: (1) new technology for policing and surveillance, (2) human rights and social justice, and (3) governance and regulation.



Also known as the “Online Compliance Intervention” or “RoboDebt.”


The algorithm operated as follows: (1) dividing the total income specified in ATO information by the number of days in the employment period specified in the ATO information to produce a daily income figure; (2) multiplying the daily income figure by 14 to produce an apportioned fortnightly income figure; (3) substituting the apportioned fortnightly income figure for the amount of income declared as actual income for each relevant fortnight; (4) calculating welfare entitlement for each relevant fortnight on the basis that the apportioned fortnightly income represented the actual income for that fortnight; (5) for each fortnight, calculating the difference between the welfare entitlement based on the apportioned fortnightly income and the amount that was paid on the basis that the reported income was the actual income; (6) aggregating the differences for all fortnights to produce an alleged debt (see Order of Justice Davies 2019).
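The six steps above can be sketched in code. This is an illustrative reconstruction only: the function names are mine, and the entitlement taper is a deliberately simplified hypothetical (the actual social security entitlement rules are far more complex). The example shows why averaging annual income across fortnights can manufacture a debt even when every individual payment was correct at the time, because lumpy earnings are smoothed into fortnights in which the person earned nothing.

```python
def apportioned_fortnightly_income(total_ato_income, employment_days):
    """Steps 1-2: average ATO income over the employment period,
    then scale the daily figure up to a fortnight."""
    return (total_ato_income / employment_days) * 14


def alleged_debt(total_ato_income, employment_days, amounts_paid, entitlement):
    """Steps 3-6: substitute the averaged figure for each fortnight's
    declared income, recalculate the entitlement, and aggregate the
    fortnightly differences into an alleged debt."""
    averaged = apportioned_fortnightly_income(total_ato_income, employment_days)
    recalculated = entitlement(averaged)  # step 4: entitlement on averaged income
    # Steps 5-6: difference between what was paid and what the averaged income
    # would have entitled the recipient to, summed over all fortnights.
    return sum(paid - recalculated for paid in amounts_paid)


# Hypothetical taper: a full payment of $500 per fortnight, reduced by
# 50 cents for each dollar of fortnightly income (not the real rules).
taper = lambda income: max(0.0, 500.0 - 0.5 * income)

# A person who earned $26,000 over 364 days, all of it in 13 fortnights of
# work (receiving no payment), and who correctly received the full $500 in
# the 13 fortnights they earned nothing:
paid = [0.0] * 13 + [500.0] * 13
print(alleged_debt(26000.0, 364, paid, taper))  # averaging yields a $6,500 "debt"
```

Under these assumptions the averaged fortnightly income is $1,000, which zeroes out the recalculated entitlement for every fortnight and turns correctly paid benefits into an alleged overpayment.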


Centrelink is the main welfare agency within Services Australia.


There is some doublespeak in describing a debt recovery program, designed specifically to identify and raise debts, as a strategy to prevent people from owing money!


Of these debts, following review, 34,458 debts have been reduced or waived, including 27,361 debts waived entirely and 15,721 debts reduced to an amount above zero. There will be further reviews of these debts in response to the Federal Court of Australia decision (discussed further below).


#NotMyDebt houses stories of the individual impacts of the RoboDebt program:


This is consistent with shifts from New Public Management toward “digital-era” governance (see, e.g., Dunleavy et al. 2005). Further, it is interesting to note that until the resignation of the minister for human services, Michael Keenan, this portfolio was a conjunct appointment as minister assisting the prime minister for digital transformation. Following Keenan’s resignation, the minister for human services is Stuart Robert, who has held the position since May 2019.


The “Raise the Rate” campaign is being run by a coalition of civil society organizations within the social and community sector including the State and Territory Councils of Social Services, the Australian Association of Social Workers, Children and Young People with Disability Australia, Combined Pensioners and Superannuants Association, Community Mental Health Australia, Consumers Health Forum of Australia, Council of Single Mothers and Their Children, Food Bank, Jobs Australia, Mission Australia, National Ethnic Disability Alliance, St. Vincent de Paul Society, Financial Counselling Australia, Public Health Association Australia, Centre for Excellence in Child and Family Welfare, Social Ventures Australia, and Youth Affairs Council Victoria, among others. For complete details on the “Raise the Rate” campaign, see the campaign website:


Current Newstart payment rates are as follows: for a single person with no children, $277.85 per week; for a single person with a dependent child, $300.55 per week; and for partnered individuals, $250.85 each per week (Newstart: How Much You Can Get, n.d.). The Australian Council of Social Services (Davidson et al. 2018) draws a poverty line for a single adult of $433 per week, and for a couple with two children of $909 per week. A high proportion (90 percent) of individuals on Newstart live below the poverty line (Davidson et al. 2018). The Raise the Rate campaign seeks to increase the Newstart payment by $75 per week, which would still leave recipients below the poverty line. All amounts are presented in Australian dollars (AUD).
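A quick calculation with the figures above (AUD per week, single person with no children; the variable names are mine) shows the gap that would remain even if the campaign succeeded:

```python
newstart_single = 277.85      # current Newstart rate (Newstart: How Much You Can Get, n.d.)
poverty_line_single = 433.00  # ACOSS poverty line for a single adult (Davidson et al. 2018)
proposed_raise = 75.00        # Raise the Rate campaign proposal

shortfall_now = poverty_line_single - newstart_single
shortfall_after_raise = poverty_line_single - (newstart_single + proposed_raise)
print(round(shortfall_now, 2), round(shortfall_after_raise, 2))  # prints: 155.15 80.15
```

That is, the current payment sits $155.15 per week below the poverty line, and a recipient would still be $80.15 per week short even after the proposed $75 increase.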


The government, under the leadership of Prime Minister Scott Morrison, later ran a successful reelection campaign on a platform of tax cuts and delivering a budget surplus (Letts 2019).


Note that these estimated savings were made prior to the Australian Federal Court case decision that RoboDebt is illegal.


The most recent (2017–18) Department of Human Services Annual Reports reveal a marked increase in the amounts of debts raised, though not necessarily the volume of debts: in 2015–16, $2.8 billion in debts raised (2,439,431 debts); in 2016–17, $2.8 billion in debts raised (2,384,911 debts); and in 2017–18, $3.2 billion in debts raised (2,493,474 debts raised). These debts have not solely been raised via the RoboDebt system.


Debts that are not recouped, yet that the government still believes it is owed, are included in the budget balance (Henriques-Gomes 2019c).


When the costs of running the program are deducted from the debts that have been recovered, the government has achieved a “profit” of about AUD 100 million. The “blowout” in costs was attributed to redesign of the program in response to criticism, which has required the employment of subcontractors to complete work that it was initially assumed would be automated (Henriques-Gomes 2019c).


A recent article in The Guardian suggests that the Human Services Department could continue to expand the program to create as many as 1.3 million further RoboDebts to meet their target of AUD 2.1 billion in “savings” (see Henriques-Gomes 2019c).


These demographics have the lowest computer and internet use in Australia, which is known as the “digital divide,” and this may impact their ability, and therefore likelihood, of navigating online systems to contest debts (see, e.g., Dane, Mason, and O’Brien-McInally 2013; Curtin 2001).


For the records of various freedom of information requests, see


This is not to advocate for real-time surveillance systems to ensure “accuracy” but rather to reflect upon the way debts are constructed as “savings” that are “recovered,” with the implication that welfare recipients, who cannot be trusted, are in the wrong and must be monitored, a situation that results in a net positive for the state.


Australian Federal Police. n.d. “Taskforce Integrity.” Accessed September 17, 2019.
Australian Government. 2017. “Australian Government Response to the Community Affairs References Committee Report: Design, Scope, Cost-Benefit Analysis, Contracts Awarded and Implementation Associated with the Better Management of the Social Welfare System Initiative.” October 10, 2017.
Australian Information Commissioner. 2019. “Justin Warren and Department of Human Services (No 2) (Freedom of Information) [2019] AICmr 30 (6 June 2019).” June 11, 2019.
Barbaschow, Asha. 2017. “Commonwealth Defends Centrelink Data Matching Bungle as Working ‘Incredibly Well.’” ZDNet. January 7, 2017.
———. 2019. “Human Services Has Spent AU$375m on ‘Robo-Debt.’” ZDNet. February 13, 2019.
Bell, Stephen. 1997. “Globalisation, Neoliberalism and the Transformation of the Australian State.” Australian Journal of Political Science 32 (3): 345–68.
Brown, Greg. 2020. “Scott Morrison’s Welfare Shake-up to Deliver $2bn Saving.” Australian, January 27, 2020.
Bucher, Taina. 2018. If...Then: Algorithmic Power and Politics. Oxford: Oxford University Press.
Carney, Terry. 2019. “Robo-Debt Class Action Could Deliver Justice for Tens of Thousands of Australians Instead of Mere Hundreds.” Conversation. September 17, 2019.
Centrelink. n.d. “Compliance Program.”
Commonwealth Ombudsman. 2017. “Centrelink’s Automated Debt Raising and Recovery System: A Report about the Department of Human Services’ Online Compliance Intervention System for Debt Raising and Recovery.” 2017.
Community Affairs References Committee. 2017. “Senate Inquiry into the Design, Scope, Cost-Benefit Analysis, Contracts Awarded and Implementation Associated with the Better Management of the Social Welfare System Initiative.” Canberra: Commonwealth of Australia.
Curtin, Jennifer. 2001. “A Digital Divide in Rural and Regional Australia?”
Dane, Sharon K., Claire M. Mason, and Beth O’Brien-McInally. 2013. “Household Internet Use in Australia: A Study in Regional Communities.” CSIRO Report: EP1310907.
Davidson, Peter, Peter Saunders, Bruce Bradbury, and Melissa Wong. 2018. “Poverty in Australia, 2018.” ACOSS/UNSW Poverty and Inequality Partnership Report No. 2. Sydney: ACOSS.
Dee, Mike. 2013. “Welfare Surveillance, Income Management and New Paternalism in Australia.” Surveillance & Society 11 (3): 272–86.
Dencik, Lina, Arne Hintz, and Jonathan Cable. 2019. “Towards Data Justice: Bridging Anti-Surveillance and Social Justice Activism.” In Data Politics, edited by Didier Bigo, Engin Isin, and Evelyn Ruppert, 167–86. London: Routledge.
Denzin, Norman. 2017. “Critical Qualitative Inquiry.” Qualitative Inquiry 21 (1): 8–16.
———. 2018. The Qualitative Manifesto: A Call to Arms. New York: Routledge.
Dunleavy, Patrick, Helen Margetts, Simon Bastow, and Jane Tinkler. 2005. “New Public Management Is Dead-Long Live Digital-Era Governance.” Journal of Public Administration Research and Theory 16 (3): 467–94.
Elton, James. 2019. “Scott Morrison Defends Expansion of Cashless Welfare Card.” ABC, September 10, 2019.
Eubanks, Virginia. 2006. “Technologies of Citizenship: Surveillance and Political Learning in the Welfare System.” In Surveillance and Security: Technological Politics and Power in Everyday Life, edited by Torin Monahan, 89–108. London: Routledge.
———. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press.
Flick, Uwe. 2017. “Challenges for a New Critical Qualitative Inquiry: Introduction to the Special Issue.” Qualitative Inquiry 23 (1): 3–7.
Foucault, Michel. 2009. Security, Territory, Population (Lectures at the Collège de France). Houndmills, UK: Palgrave Macmillan.
Galloway, Kate. 2017. “Big Data: A Case Study of Disruption and Government Power.” Alternative Law Journal 42 (2): 89–95.
Genosko, Gary, and Scott Thompson. 2002. “Administrative Surveillance of Alcohol Consumption in Ontario, Canada: Pre-Electronic Technologies of Control.” Surveillance & Society 4 (1/2).
Giauque, David. 2003. “New Public Management and Organizational Regulation: The Liberal Bureaucracy.” International Review of Administrative Sciences 69 (4): 567–92.
Gilliom, John. 2001. Overseers of the Poor: Surveillance, Resistance and the Limits of Privacy. Chicago: University of Chicago Press.
Halligan, John. 2015. “Coordination of Welfare through a Large Integrated Organization: The Australian Department of Human Services.” Public Management Review 17 (7): 1002–20.
Harvey, David. 2007. A Brief History of Neoliberalism. Oxford: Oxford University Press.
Henman, Paul. 2006. “Welfare Reform as Governance Reform: The Prospects of a Governmentality Perspective.” In Administering Welfare Reform: Institutional Transformations in Welfare Governance, edited by Paul Henman and Menno Fenger, 19–41. Bristol, UK: Bristol University Press.
———. 2017. “The Computer Says ‘Debt’: Towards a Critical Sociology of Algorithms and Algorithmic Governance.” Data for Policy 2017: Government by Algorithm?, September.
Henman, Paul, and Michael Adler. 2003. “Information Technology and the Governance of Social Security.” Critical Social Policy 23 (2): 139–64.
Henman, Paul, and Greg Marston. 2008. “The Social Division of Welfare Surveillance.” Journal of Social Policy 37 (2): 187–205.
Henriques-Gomes, Luke. 2019a. “Robodebt Scheme Costs Almost as Much as It Recovers.” Guardian, February 21, 2019.
———. 2019b. “Robodebt Could Target Pensioners and ‘Sensitive’ Groups, Leaked Documents Show.” Guardian, August 23, 2019.
———. 2019c. “Centrelink Could Launch More Than a Million New Robodebts in Next Three Years.” Guardian, September 27, 2019.
Ireland, Judith. 2019. “Government’s Robo-Debt Bill Could Run to ‘Hundreds of Millions’ After Landmark Case.” Sydney Morning Herald, November 28, 2019.
Johnstone, Barbara. 2018. Discourse Analysis. 3rd ed. Hoboken, NJ: John Wiley & Sons.
Karp, Paul. 2019a. “RoboDebt: The Federal Court Ruling and What It Means for Targeted Welfare Recipients.” Guardian, November 28, 2019.
———. 2019b. “RoboDebt: Government Abandons Key Part of Debt Recovery Scheme in Major Overhaul.” Guardian, November 29, 2019.
Kazansky, Becky, Guillen Torres, Lonneke van der Velden, Kersti Wissenbach, and Stefania Milan. 2019. “Data for the Social Good: Toward a Data-Activist Research Agenda.” In Good Data, edited by Angela Daly, Kate Devitt, and Monique Mann, 244–59. Amsterdam: Institute of Network Cultures.
Kearsley, Jonathan, and Emily McPherson. 2019. “‘Whiteboard of Shame’: Robo-Debt Compliance Officers ‘Worked to Targets.’” Sydney Morning Herald, August 9, 2019.
Khoury, Peter. 2015. “Neoliberalism, Auditing, Austerity and the Demise of Social Justice.” Social Alternatives 34 (3): 25–33.
Knaus, Christopher. 2017. “Almost Half of All Centrelink Robo-Debt Cases Sent to Private Debt Collectors.” Guardian, April 11, 2017.
Letts, Stephen. 2019. “Australia’s Economy Has Slowed to a Decade Low but the Budget May Already Be Back to Surplus.” ABC, September 4, 2019.
Maki, Krystle. 2011. “Neoliberal Deviants and Surveillance: Welfare Recipients under the Watchful Eye of Ontario Works.” Surveillance & Society 9 (1/2): 47–63.
Martin, Sarah. 2019. “Scott Morrison Says He Is ‘Puzzled’ by Opposition to Welfare Drug Testing.” Guardian, September 9, 2019.
Miller, Peter, and Nikolas Rose. 1990. “Governing Economic Life.” Economy and Society 19 (1): 1–31.
Monahan, Torin. 2017. “Regulating Belonging: Surveillance, Inequality, and the Cultural Production of Abjection.” Journal of Cultural Economy 10 (2): 191–206.
———. 2018. “Algorithmic Fetishism.” Surveillance & Society 16 (1): 1–5.
Murphy, Katharine. 2019. “‘Unfunded Empathy’: Scott Morrison Pushes Back on Growing Calls to Lift Newstart Rate.” Guardian, July 29, 2019.
Order of Justice Davies in Deanna Amato v The Commonwealth of Australia. 2019. Federal Court of Australia, VID611/2019, November 27, 2019.
Pearce, Rohan. 2017. “Government ICT Outsourcing Slammed.” Computer World, October 12, 2017.
Peña Gangadharan, Seeta, and Jędrzej Niklas. 2019. “Decentering Technology in Discourse on Discrimination.” Information, Communication & Society 22 (7): 882–99.
Powles, Julia, and Helen Nissenbaum. 2018. “The Seductive Diversion of ‘Solving’ Bias in Artificial Intelligence.” Medium. December 7, 2018.
Raise the Rate. 2019. September 17, 2019.
Right to Know. n.d. “Business Case and Pilot for Data Matching.” Accessed September 17, 2019.
Rose, Nikolas, and Peter Miller. 1992. “Political Power beyond the State: Problematics of Government.” British Journal of Sociology 43 (2): 173–205.
Senate Community Affairs Legislation Committee. 2019. “Estimates.”
Senate Finance and Public Administration Committee. 2018. “Digital Delivery of Government Services.”
Siewert, Rachel. 2019. “Senate Sends Robodebt to Inquiry for the Second Time in Three Years.” Greens, August.
Smee, Ben. 2019. “Blind Centrelink Officer Says He Was Shamed and Sacked for Slow Work.” Guardian, August 18, 2019.
Spies-Butcher, Ben. 2014. “Marketisation and the Dual Welfare State: Neoliberalism and Inequality in Australia.” The Economic and Labour Relations Review 25 (2): 185–201.
Stilgherrian. 2019. “Human Services Claimed People Wouldn’t Pay Debts If Informed about Its IT Systems.” ZDNet. June 17, 2019.
Wilcock, Scarlet. 2014. “Official Discourses of the Australian ‘Welfare Cheat.’” Current Issues in Criminal Justice 26 (2): 177–94.
Winner, Langdon. 1980. “Do Artifacts Have Politics?” Daedalus 109 (1): 121–36.
Yeung, Karen. 2018. “Algorithmic Regulation: A Critical Interrogation.” Regulation & Governance 12 (4): 505–23.