Illustration by Somnath Bhatt

Are we in a Burning House? Using the Black Women Best Framework to Remodel AI

A guest post by Serena Oduro. Serena is a Research Analyst on the Policy team at Data & Society. Her writing and research interests include AI policy and regulation, Black feminism + AI, Black philosophies + AI, and everyday Black life. Twitter: @SerenaOduro.

This essay is part of our ongoing “AI Lexicon” project, a call for contributions to generate alternate narratives, positionalities, and understandings to the better known and widely circulated ways of talking about AI.


“The fact is that the US economy was built in large part upon Black women’s labor and bodies; Black women’s consistent economic insecurity and precarity is a result of deep seated extraction that has lasted for centuries. Centering Black women also means centering other marginalized identities, all of which intersect with race and gender and are identities that Black women hold and live. It means centering trans and queer people; incarcerated and formerly incarcerated people; people with disabilities; and immigrants, documented and undocumented. Centering Black women in policymaking, and all the intersecting identities Black women hold, ensures that Black women can build economic security. But it also means that deeply entrenched racist, sexist, queerphobic, ableist, and xenophobic policies that harm the overall economy for everyone are ultimately unraveled.”¹

AI harms continue to multiply, and they affect Black women disproportionately. From facial recognition software that produces particularly high error rates for dark-skinned women to the sexualization of Black women in search engines, Black women have borne the brunt of the harms of AI systems. Calls for fairness, accountability, transparency, and explainability (FATE) within AI development and use have led to the creation of methodological and statistical approaches to fairness meant to curtail harms produced by AI.² However, FATE tools and approaches without an intersectional analysis threaten to reinforce racism and other modes of oppression.³ In this essay, I argue that we must commit to frameworks that center the wellbeing of Black women specifically. I propose that the critical AI community adopt its own version of the Black Women Best Framework (BWBF), an economic principle coined by Janelle Jones, Chief Economist at the US Department of Labor. BWBF challenges “race-neutral” approaches to economic policy solutions in the US that in practice benefit the white workforce, white elites, and white corporations.⁴ BWBF shows how critical analyses can identify who sits at the intersections of oppression, and whose needs and desires must therefore be central to policy (especially when they are groups particularly targeted and oppressed by their nation state). Because Black women experience intersecting oppressions along the axes of race, sex, class, and many others, “race-neutral” approaches to policy tend to harm Black women the most. For example, while some segments of the US economy saw improvements following the COVID-19 economic crisis, Black women’s employment rates rose the least of all groups.⁵ BWBF requires policy solutions, such as guaranteed income, hazard pay, and a federal jobs guarantee, that ensure that Black women are central beneficiaries of economic policy.

I’ve proposed this framework as an addition to the AI Lexicon to push for more honest policy analysis. Because of the structures of anti-black racism, sexism, and classism, Black women sit at the nexus of multidimensional oppression, while our experiences and expertise are often poorly captured, misunderstood, and disregarded.⁶ BWBF requires investing resources into research on the harms algorithms pose to Black women, on their potential benefits, and on which suite of policy tools and regulations protects Black women from algorithmic oppression. BWBF also demands serious consultation with and investment in expert Black women to steer the future of how AI will interact with and impact Black women.

As I entered the field of AI ethics and policy, my views were informed by my policy work at the Greenlining Institute, a policy institute based in Oakland, CA, focused on creating economic opportunity for communities of color. The calls for FATE felt hollow without a dedication to Black people’s, and particularly Black women’s, liberation. Our work highlighted that race-conscious policy was necessary to undo systemic oppression created through policy, such as the digital divide (e.g., unequal access to broadband). Without a race-conscious approach to policy, efforts to address the digital divide and other inequalities miss communities of color and claim to “solve” these issues when they, in fact, do not.

When the same lens is applied to fairness metrics, methodologies, and AI accountability policy, it becomes clear that these may appear to address algorithmic harms while, in reality, algorithmic oppression prospers. For example, without a race-conscious approach to fairness metrics, the strategies meant to improve the field of AI may actually reinforce and misconstrue the nature of race.⁷ As argued in “Towards a Critical Race Methodology in Algorithmic Fairness,” when race is treated “as an attribute, rather than a structural, institutional, and relational phenomenon, [it] can serve to minimize the structural aspects of algorithmic unfairness.”⁸ Algorithmic unfairness does not arise purely from human biases or incomplete data sets. It arises from a system of governance that perpetuates racism and sexism by creating technologies that have evolved from racist practices such as eugenics and the surveillance of Black communities.

The very nature of racism has made it so that “race-neutral” approaches to AI benefit white people at the expense of people of color. BWBF challenges the racist norms and beliefs that undergird our economy and provides policy solutions to create an American economy that benefits all. So too must BWBF be adopted in AI to counter and shift the racist, sexist, and classist norms that guide its current development and use.

Through this work, I am calling to give power to the many visions of technology Black women have for ourselves. I am calling for the field of AI to incorporate the knowledge Black women have garnered from our positionality and our own intelligence. To explore what Black Women Best would look like for AI, I held a conversation with four Black women whom I met through a series on radical Black imagination and technology: Ajay Revels, Mame “Meme” Frimpong, Natachi Mez, and Chibundo Anwuli Egwuatu. I also received an email statement from Bridget Boakye, Internet Policy Lead at the Tony Blair Institute.⁹ My intention is for this article to act as a starting point for consulting with Black women as intellectual producers and not just data points, in order to promote AI policy that centers Black women’s well-being.¹⁰ Through our virtual meeting and email exchanges, I posed the following questions:

What would AI look like if it was guided by Black Women Best?

What would AI look like if its focus was to uplift Black women?

How would this change the purposes of AI, who is in control of it, and the principles that guide it from development to deployment?

Through this process, I highlight three key themes:

1. The relevance of AI is shaken by the Black Women Best framework

AI does not disappear or become irrelevant under BWBF, but its relevance becomes secondary to the needs and desires of Black women. Creating AI and AI policy meant to address the needs and desires of Black women requires us to address the systems that oppress Black women: capitalism, anti-blackness, and patriarchy (to name a few). BWBF presents a standpoint that opposes many of the reasons AI has gained traction. For example, BWBF would require tech companies to end government contracts that surveil and oppress Black people. A commitment to FATE without opposition to systems of oppression will only benefit nationalist and corporate interests and present dangers to Black communities, civil liberties, and the environment. In our conversation, Chibundo Anwuli Egwuatu posed the question, “Is AI a burning house?” And if it is, who will be burned? Black women and others experiencing systemic oppression surely will. Taking a bold BWBF position challenges the current AI ecosystem that perpetuates systemic oppression, and offers an opportunity to develop non-oppressive methodologies and policies.

2. To achieve a Black Women Best framework for AI development, use, and regulation, Black women’s voices must be centered

Black women who have brought the issue of racial justice and tech to the mainstream with their groundbreaking research are not safe in this field. For example, Google fired Dr. Timnit Gebru for doing exactly that work, and “60 Minutes” completely erased the groundbreaking facial recognition research and advocacy of the Black women AI researchers Joy Buolamwini, Dr. Timnit Gebru, and Inioluwa Deborah Raji. In response to the misogynoir that Black women within the field of AI face, Katlyn Turner, Danielle Wood, and Catherine D’Ignazio created an entire playbook that documents this form of abuse.¹¹

The number of Black women in AI is small,¹² and the treatment of those already in the field shows that the fight for a just AI cannot lie in the hands of Black women alone. As Bridget Boakye noted, those with power in the field of AI must uplift, invest in, and prioritize Black women’s ideas and scholarship in order to develop an equitable field of work. This includes Black women in technical, policy, activist, and community roles, as AI’s problems extend far beyond technical considerations. Without including and investing in Black women, suboptimal and discriminatory impacts will continue to mark the field as deeply flawed.

3. A call for a Black Women Best approach to AI development is not a direct call for more data on Black women

In response to data’s historical and current use as a weapon to oppress Black communities, the organization Data for Black Lives arose to use data science to create measurable positive change in Black people’s lives.¹³ On its microsite on algorithmic racism, Data for Black Lives notes that the practice of using data to preserve the power imbalances of capitalism grew from the foundations of slavery.¹⁴ Slave trade corporations relied on massive data sets created by slave traders and plantation owners to maximize profits.¹⁵ Throughout the history of white supremacy, Black people have been heavily surveilled so that Black labor and life could be made profitable. In order for AI to work for Black women, it cannot contribute to, and should actually remain in opposition to, the hyper-surveillance of Black women.

Not only does this challenge bias-correction norms in AI that often call for more data, but it also challenges the fundamental nature of AI. For example, it is well documented that facial recognition technology is biased against Black people, in part because of a lack of representation in the data sets that underpin these technologies. While one commonly proposed solution is to collect more data on Black people, these very same technologies are disproportionately used against Black communities. A BWBF approach would work to ban or limit invasive technologies like facial recognition, rather than work to improve a technology that is often used to surveil, disempower, and oppress. As a tool that extracts meaning from historical data, how can AI benefit Black women when people in power and systems of power continuously try to extract from Black women?

Conclusion

The Black Women Best Framework lays out a multifaceted approach that enables AI practitioners to commit to a just and liberatory AI. If fighting against systems of oppression does not become a central goal within the field of AI, then the fight for fairness, accountability, transparency, and explainability within AI will struggle to emerge from the capitalist, nationalist, and white supremacist waters it wades in. If Black women’s voices are not centered, their ideas, concerns, and dreams will continue to be dismissed and discredited while the harms of AI multiply. BWBF provides a path for the critical AI community to remodel AI in a way that is honest about systems of oppression: a path that uplifts the ideas, concerns, needs, and dreams of Black women in order to build technology that is beneficial for all.¹⁶


References:

[1] Bozarth, K., Western, G., & Jones, J. (2020). Black Women Best: The Framework We Need for an Equitable Economy. Roosevelt Institute. p.3.

[2] Algorithmic Equity: A Framework for Social Applications (RAND Corporation); Principled Artificial Intelligence (Berkman Klein Center, Harvard University).

[3] Hanna, A., Denton, E., Smart, A., & Smith-Loud, J. (2020) Towards a critical race methodology in algorithmic fairness. Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. DOI: 10.1145/3351095.3372826. p.1.

[4] Bozarth, K., Western, G., & Jones, J. (2020). Black Women Best: The Framework We Need for an Equitable Economy. Roosevelt Institute. p.10.

[5] Bozarth, K., Western, G., & Jones, J. (2020). Black Women Best: The Framework We Need for an Equitable Economy. Roosevelt Institute. p.10.

[6] Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal Forum, 1989(1), Article 8. http://chicagounbound.uchicago.edu/uclf/vol1989/iss1/8

[7] Hanna, A., Denton, E., Smart, A., & Smith-Loud, J. (2020) Towards a critical race methodology in algorithmic fairness. Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. DOI: 10.1145/3351095.3372826. p.1.

[8] Hanna, A., Denton, E., Smart, A., & Smith-Loud, J. (2020) Towards a critical race methodology in algorithmic fairness. p.1.

[9] I received consent from all five women to mention their names and ideas from our conversations in this article.

[10] My methodology was also rooted in a stanza I wrote in my piece “Do we need AI or do we need Black Feminisms: A poetic guide”: “I dream of AI being /Crafted by Black hands/ And Black dreams./ If I can sit around the table/ And gush about AI/ With my Mom, Sister/ And Aunties,/ then I’ll believe in it.” Oduro, S. (pending 2021). Do we need AI or do we need Black Feminisms: A poetic guide. Meatspace Press. Black women’s forms of kinship and community should be seen as an important part of knowledge making in the field of AI and academia.

[11] Turner, K., Wood, D., D’Ignazio, C. (2021). The Abuse and Misogynoir Playbook. The State of AI Ethics Report (Jan 2021), 15–34.

[12] Snow, J. (2018, February 14). “We’re in a diversity crisis”: cofounder of Black in AI on what’s poisoning algorithms in our lives. MIT Technology Review. https://www.technologyreview.com/2018/02/14/145462/were-in-a-diversity-crisis-black-in-ais-founder-on-whats-poisoning-the-algorithms-in-our/.

[13] Data for Black Lives. (n.d.). About Us. https://d4bl.org/about.html.

[14] Data for Black Lives. (n.d.). Data Capitalism. https://datacapitalism.d4bl.org/.

[15] Ibid.

[16] The Combahee River Collective (1978) A Black Feminist Statement. Capitalist Patriarchy and the Case for Socialist Feminism. Ed. Eisenstein, Z. R. New York: monthly review press.