Monday, December 29, 2025

The New Jim Code: Ruha Benjamin on How Technology Reimagines Race and Justice

In The New Jim Code: Race, Carceral Technoscience, and Liberatory Imagination, scholar Ruha Benjamin asks a deceptively simple question with deep consequences: What happens when the technologies we build reflect the biases we carry? Her answer is both urgent and subtle—technology doesn’t merely automate decisions; it reshapes social hierarchies in ways that often reinforce old injustices under the guise of innovation and neutrality.

Benjamin’s work is situated at the intersection of science and society, a place where assumptions about objectivity often obscure entrenched patterns of inequality. The “New Jim Code” she describes is a play on the infamous Jim Crow laws—legalized systems of racial segregation in the United States—and yet it signals something distinct: a coded, seemingly technical logic that licenses discrimination through data, algorithms, and “smart” systems that quietly perpetuate the biases of their human architects.


When Neutrality Isn’t Neutral

One of the central themes of Benjamin’s analysis is the myth of technological neutrality. We like to think that algorithms are impartial because they run on code, not sentiment. Yet Benjamin shows how data is never raw; it is collected from social contexts already shaped by inequality. Predictive policing tools, for example, use historical crime data to forecast where crimes are likely to occur next. But when that historical data reflects disproportionate policing of Black and brown neighborhoods, the algorithm doesn’t correct for bias—it amplifies it. The result? “Smart” policing that is anything but smart about justice.
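The feedback loop Benjamin describes can be made concrete with a toy simulation. This is an invented illustration, not any real predictive-policing system: two neighborhoods have the same true rate of offending, but one starts with more recorded incidents because it was patrolled more heavily, and the "prediction" simply allocates patrols in proportion to past records.

```python
import random

random.seed(0)

# Two neighborhoods with the SAME underlying rate of offending.
# Neighborhood A starts with more recorded incidents only because
# it was patrolled more in the past (arrest records, not true crime).
true_rate = 0.1
recorded = {"A": 20, "B": 5}

for year in range(10):
    total = sum(recorded.values())
    # "Predictive" allocation: patrols proportional to past records.
    patrols = {n: int(100 * recorded[n] / total) for n in recorded}
    for n in recorded:
        # Arrests scale with patrol presence, not with any real
        # difference in behavior between the neighborhoods.
        recorded[n] += sum(random.random() < true_rate
                           for _ in range(patrols[n]))

share_A = recorded["A"] / sum(recorded.values())
print(f"Share of records attributed to A after 10 rounds: {share_A:.0%}")
```

Because each round's arrests track patrol intensity rather than behavior, neighborhood A's initial 80% share of the records is reproduced indefinitely; the loop never corrects toward the true 50/50 split.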

Similarly, facial recognition systems are celebrated as tools of security and convenience, yet they perform far worse on darker-skinned faces—so much so that error rates can be several times higher than for lighter-skinned faces. Here, a technology marketed as universally useful betrays deeply racialized assumptions embedded in its training data and design priorities. The problem isn’t just misclassification. It’s a form of exclusion: who counts as legible to the system, and who remains invisible or misrecognized?


Carceral Technoscience: New Tools, Old Patterns

Benjamin uses the term “carceral technoscience” to describe how technology often extends the reach of punitive systems rather than dismantling them. Bail algorithms, risk assessment scores, and court sentencing tools are frequently adopted with promises of efficiency and fairness. But in practice they can harden disparities in the criminal justice system, transforming subjective judgments into “objective” metrics that are harder to contest.

Imagine a bail algorithm that scores a defendant as high-risk based on zip code, employment status, or prior convictions. Though presented as neutral, these factors are inflected with socio-economic and racial biases. The defendant isn’t just judged by their individual circumstances; they are judged by the social history of their community—a history marked by differential policing, economic exclusion, and unequal investment. Technology, in this sense, does not transcend inequality; it entrenches it.
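The proxy problem can be sketched in a few lines. The features and weights below are invented for illustration and reproduce no real risk-assessment tool; the point is structural: two defendants with identical conduct receive different scores purely because of where they live.

```python
# A toy scoring rule of the kind Benjamin critiques. Features and
# weights are hypothetical; no actual bail tool is reproduced here.
HIGH_ENFORCEMENT_AREAS = {"area_1"}  # historically over-policed neighborhoods

def risk_score(area: str, employed: bool, priors: int) -> int:
    score = 2 * priors          # priors reflect past policing intensity
    if area in HIGH_ENFORCEMENT_AREAS:
        score += 2              # penalizes place, not conduct
    if not employed:
        score += 1              # penalizes class, not conduct
    return score

# Two defendants with identical conduct; only their neighborhoods differ.
a = risk_score("area_1", employed=True, priors=1)
b = risk_score("area_2", employed=True, priors=1)
print(a, b)  # the gap comes entirely from where each defendant lives
```

Once the rule is written down, the "objective" metric is visibly a function of social history, yet in practice the formula is rarely this easy to inspect or contest.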


Liberatory Imagination: Beyond Critique

Yet Benjamin’s work is not only diagnostic; it is generative. She insists that understanding the New Jim Code must go hand in hand with imagining alternatives. What would technologies designed not to reinforce carceral logics but to promote justice and equity look like? This is where her idea of liberatory imagination becomes essential.

A liberatory approach doesn’t start with technology as given, but with people and communities. Instead of asking how to make policing more efficient, we might ask how to reduce harm and build safety through investment in housing, education, and health—areas where technology can support, rather than replace, human judgment and care. Rather than using risk scores to predict “dangerous” individuals, data could be used to identify sites of need and opportunity. The shift is profound: from managing populations to empowering them.


Why It Matters

In an era increasingly defined by algorithms—from credit scoring to hiring systems, from predictive policing to educational gatekeeping—The New Jim Code reminds us that we are not passive recipients of technological change. The values and assumptions we embed in systems today will shape social realities tomorrow.

Ruha Benjamin’s book is a call to intellectual vigilance and collective imagination. It asks us not just to critique the technologies that surround us, but to envision technologies—and social orders—that reflect our highest aspirations for justice, dignity, and shared humanity. In doing so, she turns the question back to us: What kind of future are we coding?
