Digging Deeper on the “Tools that Profile, Police, and Punish the Poor”

In mid-September, Insight president Anne Price had the pleasure of sitting down with Virginia Eubanks, an associate professor of political science at the University at Albany, SUNY, to discuss her work and her new book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.

Virginia has worked as a welfare rights advocate and has spent the past several years examining how automated social exclusion is growing through the use of predictive models and algorithms that replace or augment human decision-making in our social welfare system.

The following excerpts from their conversation have been edited lightly for brevity and flow:

ANNE: In your book, you refer to reverse redlining as “rational discrimination.” Rational discrimination does not require bias or resentment; it only requires ignoring bias that already exists. What do you see on the horizon as a result of rational discrimination?

VIRGINIA: We typically think about racial discrimination in the United States in terms of exclusion. We draw a red line around a neighborhood and say you have to stay on the other side of that line. One of the important things about the political moment we are in right now is that while this still happens, there is another form of discrimination that is equally important to address — which is the discrimination that happens when you are included in a system that despises you.

Reverse redlining is drawing a line and saying, come in, come in. This is a challenge to our political imaginary, which is based around acknowledging, recognizing, and addressing exclusions. I see this in the work around technology, in the way we frame justice in terms of access. The justice concern around technology and poverty, and around technology and race, is that people do not have enough access to the tools. What is in the back of my head as I do this work is: what does inequality and injustice look like in a system where the participants are despised?

ANNE: I actually worked in the human service sector for a number of years and found that there were so many unwritten rules and hidden practices that are baked into our culture. What was most startling to you about practices that are becoming deeply embedded into algorithms?

VIRGINIA: One of the great tensions in the book is the tension around discretion. Discretion, on one side, is what kept people of color, particularly African Americans, from receiving any kind of public assistance in the United States well into the 1970s. Discretion has played a deep role in the injustices of the public service system. It makes sense that folks with specific ideas about how bias works would say, ‘You know what we need to do? We need a more objective system for deciding who gets points.’ Rational discrimination is systemic and structural; it is not based on irrational thought patterns, and it doesn’t require bad intentions to get to bad outcomes.

One thing I struggle with is this tension around discretion. Discretion can be the worst thing that happens to you in a public service system, but the hard truth is that it can also be the best thing that happens to you in a public service system. Having been a welfare rights organizer for 20 years, I found it really clear that bending the rules was one of the only ways to get through the system safely, because the system is not set up to help you succeed. The public system is set up to keep you off the system. Of course, this opens the door for all kinds of discrimination, which is terrible, but the solution is not to hand that discretion over to an electronic system that is incapable of bending the rules.

There is a historical parallel here with mandatory minimums and sentencing guidelines in the 1980s. The thinking at the time was that we could reduce racial discrimination in the criminal justice system by giving judges very strict and rigid frameworks under which to operate. We know now that this didn’t make our criminal justice system less racist; it just made it bigger much faster.

One of my great fears is that we will do the same by removing discretion from the frontlines. Discretion is like that famous quote: “energy is never destroyed or created, it’s only moved.” We are not removing discretion; we are taking it away from the frontline. And if we are doing that, where is it going? It turns out in this case that it is going to economists, data scientists, and people around the world who are actually much further away from the problems facing the poor than the frontline is. This is not to say frontline decision-making is always good; there are still all sorts of systemic problems with it. But my concern is that we see human decision-making as opaque and unknowable, and computerized decision-making as transparent and clear. We feel like computers are more objective, and that being neutral and objective is the same as being just. I believe it’s giving up on the possibility of moving forward together as a community.

ANNE: I have written about tackling deservedness, personal responsibility and anti-blackness. I have referred to this work as a process of truth and reconciliation. In your book you say that “justice requires the possibility of redemption and the ability to start over.” What does that look like and what can move us forward?

VIRGINIA: There are two different pieces I want to talk about. First, I want to talk about this narrative work. One of the things I think is really important about the book is my attempt to create a narrative around these systems being evolution, not revolution. They are grounded in a specific historical context. They come out of the ways we have thought about and punished poverty in this country. And talking about poverty and class is impossible without talking about race. There is technical and political work to be done, but there is also very important cultural work to be done, and that is around narrative. That’s the stuff I am most excited about right now.

One of the things I find inspiring is the very difficult work laid out by the Poor People’s Economic Human Rights Campaign, which helps people identify with “poor” as a political identity, and asks how that identification would shift our politics. We have a story of poverty in America that is one of aberration, something that only happens to a very small percentage of people. But the reality is that 51% of us will fall below the poverty line at some point in our lives, and two-thirds of us will access means-tested programs. The reality is that poverty is a majority experience, but we don’t all experience it the same way. Yet look at all of the energy and resources we are putting into creating moral thermometers for deciding who is deserving enough to share in our collective wealth. These are not just wasted resources, but criminally and cynically exploitative systems for reproducing this story we tell about poverty. It is time to get to a new place.

Insight Center for Community Economic Development

The Insight Center for Community Economic Development’s mission is to help people and communities become, and remain, economically secure.