What do algorithmic systems and viral pandemics have in common? They both expose and amplify structural social inequalities. They both appear neutral (think about the frequent use of the phrase ‘the virus doesn’t discriminate’), but they both result in outcomes that are worse for those with less power and privilege.
Until the lion learns to write, every story will always glorify the hunter.
In 2019 researchers reported that the Impact Pro algorithm, one of the largest commercial risk-prediction tools used by health-care providers in the United States, was racially biased (Obermeyer et al. 2019). It flagged patients who might benefit from extra care based on how much they were predicted to cost the medical system in the future. Relatively healthy White patients were more likely to get priority over sicker Black patients. This is because Impact Pro relied on previous spending, which was historically lower for Black patients due to systemic inequalities in wealth and access to health care. Importantly, Impact Pro didn’t explicitly use data on race to make its predictions. The data it did use, including age, sex, insurance types, diagnosis and procedure codes, and medications, was taken from historical insurance claims (Obermeyer et al. 2019, 3). Thus previous spending, or “cost,” was used as a proxy for health. But what a person costs the health-care system depends on the barriers they face to access it. For Black patients, these barriers include having less access to transportation, competing demands from jobs or child care, less “knowledge of reasons to seek care,” and less trust in the health-care system (Obermeyer et al. 2019, 4). By using cost as a proxy for health, the algorithm amplified these inequalities without even collecting data on race. This was a clear example of how algorithmic systems that don’t take structural inequalities into account simply make them worse.1
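The proxy problem described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not the real Impact Pro model: all numbers and the toy cost function are invented to show how ranking patients by predicted cost can deprioritize a sicker patient whose historical spending was suppressed by access barriers, even though race is never an input.

```python
# Illustrative sketch (not the real Impact Pro model): why ranking
# patients by predicted *cost* can deprioritize sicker patients when
# one group's historical spending is suppressed by access barriers.
# All numbers below are invented for illustration.

def predicted_cost(illness_severity, access_factor):
    """Toy cost model: spending scales with severity but is
    discounted by barriers to accessing care (access_factor < 1)."""
    return illness_severity * access_factor

# Two hypothetical patients: the second is sicker, but faces
# barriers that historically reduced their recorded spending.
patient_a = {"severity": 4.0, "access": 1.0}   # healthier, full access
patient_b = {"severity": 7.0, "access": 0.5}   # sicker, reduced access

cost_a = predicted_cost(patient_a["severity"], patient_a["access"])  # 4.0
cost_b = predicted_cost(patient_b["severity"], patient_b["access"])  # 3.5

# Ranking by the cost proxy prioritizes the healthier patient,
# even though no racial data was ever used in the calculation.
prioritized = "A" if cost_a > cost_b else "B"
print(prioritized)  # A — the less sick patient gets the extra care
```

The point of the sketch is that the bias lives in the choice of label (cost), not in any explicit demographic feature.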
Since our world turned upside down, COVID-19 has become another “system” (a complex, interdependent set of biological, environmental, political, socioeconomic, psychological, media, and technological elements) that brings intersecting inequalities starkly into the light. In the United Kingdom, 89 percent of nurses and 84 percent of care workers are women. As a result, they are at high risk of infection while also often being paid poverty wages and being ineligible for sick pay (Booth 2020). For women migrant domestic workers who are often on insecure contracts, income loss also affects dependents in their home countries (Mlambo-Ngcuka 2020). Gender inequality is being exposed in domestic settings where, globally, women are responsible for the majority of domestic labor and care work. Research from the United Kingdom’s Institute of Fiscal Studies showed that in lockdown, mothers in two-parent households were doing, on average, only a third of the uninterrupted paid work hours of fathers. Of those who were in paid work prior to the lockdown, mothers were found to be 47 percent more likely than fathers to have permanently lost their job or quit (Andrew et al. 2020). Domestic and technology-facilitated abuse is also exacerbated by the virus, disproportionately impacting women globally. The office of the eSafety Commissioner in Australia reports increases of 50 percent in online abuse and bullying, and in the Philippines, the Commission on Human Rights has said peer-to-peer online violence against women and girls has intensified amid the quarantine.
Class inequalities are also blatant. While elite Silicon Valley “preppers” are building bunkers and stockpiling food and high-end ventilators (Williams and Engel Bromwich 2020), others are “sheltering” in unsafe spaces, struggling to access food and pay rent. You can stay home only if you can afford to stay home. Internet access is radically different for the rich and the poor too. Qatar has 100 percent internet coverage, whereas Sudan has only 30 percent, creating huge disparities, even within one region, between those who can and those who cannot access online resources and education.
Racial inequality is being exposed as well. In the United Kingdom, 70 percent of deaths among health- and social-care workers have been people from an ethnic minority (Rashid 2020). In New York City, coronavirus is killing Black and Latino people at twice the rate of White people (Mays and Newman 2020). The underlying reasons for the disproportionate impact on communities of color in the United States are multiple, including being more likely to live in poverty; having disproportionately high rates of chronic disease; being exposed to higher levels of chemical pollution; having less access to a physician or health-care insurance; having larger, denser, and intergenerational living conditions that make social distancing harder; and having lower-paid, less secure jobs (Beard 2020). As with Impact Pro, race and class are clearly intersecting factors here. The social implications of lockdown measures and government advice do not impact racial minorities equally either. As Ohio-based Aaron Thomas’s viral tweet showed, Black men in America are afraid that recommendations to wear face masks could expose them to racial profiling and police harassment. As he put it, “I want to stay alive but I also want to stay alive” (Thomas 2020).
So how can feminisms—pluralistic manifestations of the fight for gender equity and social justice, which respond and morph in relation to cultural shifts and changes—address some of these injustices? Black feminist scholarship and the creative practices it has informed have already paved the way for understanding the ways that surveillance is racialized. For Simone Browne, surveillance technologies predicated on colonial logics can be challenged and subverted by acts of cultural production in which we can find “performances of freedom and suggestions of alternatives to ways of living under a routinized surveillance” (Browne 2015, 8). The art project UNSTOPPABLE by Micha Cárdenas, Patrisse Cullors, Edxie Betts, and Chris Head explores DIY bulletproof clothing created as a collective response to the question “What would technology for Black Lives be?” (Cárdenas n.d.). Dr. Irene Tokini Fubara-Manuel’s 2017 video game Border Ritual 2.0 uses browser-based role play to attempt to intervene in the process of crossing the UK border (Fubara-Manuel n.d.). These acts of the imagination are possible feminisms that can change the way we understand social injustices amplified by technologies.
There have been reports that the pandemic is a disaster for feminism (Lewis 2020), and it is clear that the impacts on women and minority groups are disproportionate. But I prefer to think it is a disaster for the ways societies and technologies have been calcified by oppressive matrices of capitalism, colonialism, heteropatriarchy, and White supremacy. The pandemic is a battle cry for the evolution of feminisms precisely because it reveals our work is not done. We need feminisms more than ever; we need them to transcend borders, to continue to fight for social justice, and to instantiate alternatives to the old order.
What biased algorithmic systems and COVID-19 inequalities also have in common is that they can be and are being called out by cross-cultural, intersectional feminist approaches, through which individuals and communities are developing countermeasures. Feminist initiatives tackling COVID-19 inequalities include the Feminist COVID Response website, a “volunteer online data repository of information on feminist principles and actions, as well as policy responses to the COVID crisis” (Burns and Reyes 2020); the Pan-African Women, Girls & Activists COVID-19 Response Plan, which “coordinates and supports African women and girls in their response towards COVID-19 while influencing policies and state responses to the pandemic” (Pan-African Women COVID-19 2020); and a policy brief from the Centre for Feminist Foreign Policy, which advises on “how governments can implement inclusive, gender responsive emergency policy responses to mitigate the unique and disproportionate effect of the pandemic on already marginalised groups” (Lunz et al. 2020). Feminist initiatives tackling biased algorithmic systems include Feminist.AI, which works collaboratively with community members to “reimagine technologies and build alternatives to increasingly predatory algorithms that put people in danger” (Feminist.AI n.d.), and the Data + Feminism Lab at MIT, which “uses data and computational methods to work towards gender and racial equity” (Data + Feminism Lab n.d.).2 These initiatives share several characteristics: they analyze and highlight unequal gender power relations amplified by the pandemic and/or technologies; they communicate about the impacts experienced by underrepresented groups; and they center gender equity and social justice in the process of rebuilding a more just and sustainable global society.
The dominant order in industries and governments should be learning from these approaches, so that they avoid “gender-blind” policy and decision-making that underserves the many and privileges the few.
As voluntary surveillance technologies such as contact-tracing apps are developed in an effort to curtail the pandemic, I hope that technologists and their employers engage with broader scholarship that highlights how these technologies might disproportionately impact underrepresented communities. Simone Browne, for example, elucidates how contemporary practices of surveillance must be understood in the context of the policing of Black bodies from transatlantic slavery to the present (Browne 2015). Ruha Benjamin shows how algorithmic systems amplify racial hierarchies (Benjamin 2019). This is vital work that should contextualize current concerns about how contact-tracing technologies might discriminate against low-income Black and other people of color, whether it is because they live and work in spaces that are more likely to lead to false-positive test results, because they are disproportionately penalized for violating quarantine measures, or because they do not have access to Bluetooth-enabled smartphones (Landau, Lopez, and Moy 2020). I also hope that companies adapting their facial recognition algorithms to recognize faces with masks turn to Joy Buolamwini and Timnit Gebru’s Gender Shades project, which exposes racial bias in facial recognition technologies (Buolamwini and Gebru 2018). High hopes, perhaps, but I believe they are necessary.
Our motto at Feminist Internet is “There is no feminism, only possible feminisms. There is no internet, only possible internets.” The first half of this motto is not a nihilistic claim that there is no such thing as feminism but rather a series of recognitions: that although common concerns may unify the feminist cause, differences in local axes of power, living conditions, cultural traditions, and political regimes lead to differences in their focus and implications; that dominant narratives need to be dismantled and intersectional histories retold to center the vital activism and contributions of women outside White Western feminism; and that not all women who fight for their rights day to day would label their actions as feminist. But perhaps most importantly for the moment we are living in today, possible feminisms are potential actions toward equity and liberation.
We might think of them first as acts of the imagination, which allow our minds to travel to familiar and unfamiliar places and use what we discover about our differences to illuminate common struggles and forms of brilliance. These travels also encourage us to attend to unequal power relations among groups of women and confront sometimes uncomfortable realities about differences of privilege. We must, of course, question the politics of the imagination, which Ruha Benjamin understands as a contested territory of action. It is, she says, “a resource, a battleground, an input and an output of technology and social order,” and we should be concerned with the fact that most people are forced to live “inside someone else’s imagination” (Benjamin n.d., 2:25). Whose imagination are we forced to live in now that the internet is being understood as a kind of fiber optic panacea for the pandemic? Is it the imagination of institutional centers of technosocial power, promising boundless human connection, access, and productivity in the “new normal”? Yes. Is it the imagination expressed in dominant state narratives? Yes. Are these narratives inclusive and intersectional? No. Do they aim to expose inequality and social injustice? No.
In an interview about the Pan-African Women COVID-19 Online Hub created in response to the pandemic, FEMTECH head Mwanahamisi Singano notes that an African proverb says it all: “Until the lion learns to write, every story will always glorify the hunter” (Kagumire 2020). She emphasizes the importance of intersectional analysis and African feminist scholarship for exposing “white-washed and male-oriented COVID-19 responses.” The Swedish Kvinna Till Kvinna Foundation, which promotes women’s rights in more than twenty conflict-affected countries, implores the government of Sweden to advocate for a feminist perspective in the coronavirus response, noting the importance of cross-border cooperation and support for resource-poor countries. The foundation states: “Sweden’s acclaimed feminist foreign policy is even more relevant for Swedish aid and development cooperation in times of crises” (Kvinna Till Kvinna Foundation 2020). The Hawai’i State Commission on the Status of Women has introduced a “feminist economic recovery plan” that aims to help women, girls, femme-identified and nonbinary people, racialized women/women of color, and Native Hawaiian, Pacific Islander, and immigrant women to recover from economic hardships created by the pandemic. Its executive summary begins: “The road to economic recovery should not be across women’s backs,” a powerful metaphor to open the commission’s call for centering gender in the state’s path toward social justice (Jabola-Carolus 2020). A policy brief from Oxfam India’s Charter of Demands recommends that a feminist approach “with an explicit analysis of gender power relations” should be used in responding to the pandemic (Pitre, Menon, and Jairath 2020). These global calls for feminist approaches signal strongly that we need possible feminisms now more than ever.
These efforts have some distinctly feminist threads in common: they all advocate for an analysis of the gender power relations amplified by the pandemic; they focus on the impact on underrepresented groups and how resources can be delivered to them as a matter of priority; they emphasize that the pandemic is an opportunity to center gender equity and social justice in the process of rethinking and rebuilding society; and they emphasize the importance of feminist approaches to policy-making.
When considering how technology can help support those impacted the most by COVID-19, a transnational perspective on questions of equity and access is, of course, vital. A UNICEF study led by Dorcas Erskine, “Not Just Hotlines and Mobile Phones: GBV Service Provision during COVID-19,” outlines the challenges of reaching gender-based violence (GBV) survivors who cannot easily access phone-based support (Erskine 2020). The report notes that although the mainstream media has focused on increases in the number of calls to helplines, providers in some parts of the world are reporting a dramatic drop in calls, which has raised fears that not all survivors are able to safely call for help in situations of confinement and close monitoring by abusers at home. Emerging solutions include adapting existing physical safe spaces into phone booth reporting stations and activating “alert chains” in permitted premises. In the alert chain model proposed by governments and some women’s rights organizations in high-income countries, survivors can approach proprietors such as shopkeepers or pharmacists and use a code word to signal their need for expert support or police services. Erskine notes that these principles “may be transferable in some low income and humanitarian settings” but require “some form of ongoing GBV support infrastructure to be in place to be effective” (Erskine 2020, 4). She suggests that adaptations for humanitarian contexts might include using different alert objects such as colored cloth, which could be included in “dignity kits,” if they are being received; providing retailers with a phone linked to GBV providers; and connecting survivors to key organizations that may hold a “community phone” from which phone-based GBV services can be delivered.
In global activism, research, and creative practice, cross-cultural feminisms are being deployed to challenge injustice and develop technologies that promote rather than destroy equity. But beyond these spheres, possible feminisms are being enacted in daily life, and they too can inspire possibilities for others. As Maimuna Jeng cautions, we must not overlook “women who do not have ‘feminist’ on their bios but are resisting and defying in their homes, schools and workplaces.” We must remember the women “who fight for equality without the theories and contextualisation, or the conferences, workshops and convenings” (Jeng 2020). Possible feminisms can be encoded into these spheres, but they also already exist in places that we might not have expected, and the more we talk about them, celebrate them, and act in response to them, the more likely it is that our stories will glorify the lion, not the hunter.
Dr. Charlotte Webb is co-founder of Feminist Internet, a collective aiming to disrupt inequalities in internet systems, products, and services by working with stakeholders who build and use them. The collective works at the intersection of creative practice, feminism, and technology development. She recently created a master’s degree in internet equalities for the Creative Computing Institute, London, and developed a four-week online course, Design a Feminist Chatbot, for FutureLearn. She is founding director of Even, a consultancy providing creative approaches to tech equity for the next generation of business. She was nominated by the Evening Standard as one of the most influential people in technology and science in London in 2018, has been widely featured in the international press, and has presented her work globally, including at TEDx, Disruption Network Lab, Reykjavik Global Women’s Forum, Cannes Lions Festival of Creativity, and the Barbados Internet Governance Forum.
The good news is that these kinds of algorithmic biases can be and have been corrected. In the case of Impact Pro, the algorithm manufacturer worked with the researchers who identified these biases, and by adjusting the algorithm so it measured a combination of health and cost prediction, they achieved an 84 percent reduction in bias.
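The shape of that correction can also be sketched in code. This is a hypothetical illustration of the kind of fix the note describes, not the vendor's actual model: the pure cost label is replaced by a blend of predicted cost and a direct health signal (here, an invented count of active chronic conditions), with an arbitrary 50/50 weighting chosen only for the example.

```python
# Hypothetical sketch of the reported fix: score patients on a blend
# of predicted cost and a direct health measure, rather than on cost
# alone. The blend weight and all numbers are illustrative, not the
# real adjusted model.

def blended_need(predicted_cost, chronic_conditions, weight=0.5):
    """Score combining predicted cost with a direct health signal."""
    return weight * predicted_cost + (1 - weight) * chronic_conditions

# Two hypothetical patients: B is sicker (more chronic conditions)
# but has lower recorded cost because of access barriers.
cost_a, conditions_a = 4.0, 1
cost_b, conditions_b = 3.5, 5

score_a = blended_need(cost_a, conditions_a)  # 2.5
score_b = blended_need(cost_b, conditions_b)  # 4.25

prioritized = "A" if score_a > score_b else "B"
print(prioritized)  # B — the sicker patient is now flagged first
```

Because the direct health signal enters the label, the ranking now tracks need rather than historical spending alone, which is the intuition behind the reported reduction in bias.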
The Algorithmic Justice League, which is “building a movement to shift the AI ecosystem towards equitable and accountable AI,” does not use the language of feminism, but I include a reference here, since it is a leading project in the field of AI bias: see Algorithmic Justice League, “Racial Justice Requires Algorithmic Justice. Support the Movement,” 2020, https://www.ajlunited.org/.