'Automating Inequality': Algorithms In Public Services Often Fail The Most Vulnerable

Feb 19, 2018
Originally published on February 20, 2018 9:23 pm

In the fall of 2008, Omega Young got a letter prompting her to recertify for Medicaid.

But she was unable to make the appointment because she was suffering from ovarian cancer. She called her local Indiana office to say she was in the hospital.

Her benefits were cut off anyway. The reason: "failure to cooperate."

"She lost her benefits, she couldn't afford her medication, she lost her food stamps, she couldn't pay her rent, she lost access to free transportation to her medical appointments," Virginia Eubanks tells NPR's Ari Shapiro. Eubanks is the author of a new book, Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor.

"Young died on March 1, 2009," Eubanks says. "The next day, she won an appeal for wrongful termination and all of her benefits were restored the day after her death."

Young's story is one of three detailed case studies Eubanks draws to illustrate that automated systems used by the government to deliver public services often fall short for the very people who need them most: an effort to automate welfare eligibility in Indiana, a project to create an electronic registry of the homeless in Los Angeles, and an attempt to develop a risk model to predict child abuse in Allegheny County, Pa.

Welfare Eligibility In Indiana

With automation, Eubanks says Indiana lawmakers wanted to save money and streamline the state's welfare system.

"But the way the system rolled out, it seems like one of the intentions was actually to break the relationship between caseworkers and the families they served," the author says.

In promoting the contract, she says, the governor kept pointing to one case to suggest that a system that lets caseworkers and families develop personal relationships invites fraud.

"There was one case where two caseworkers had colluded with some recipients to defraud the government for about $8,000," she says. "So what happened is the state replaced about 1,500 local caseworkers with online forms and regional call centers. And that resulted in a million benefits denials in the first three years of the experiment, which was a 54 percent increase from the three years before."

But, Eubanks says, automated public service systems that serve those living in poverty or with poor health are not inherently less effective than mainstream automated services like Uber or Lyft. Rather, she worries that these systems are used "as a kind of empathy override."

"One of my greatest fears in this work is that we're actually using these systems to avoid some of the most pressing moral and political challenges of our time — specifically poverty and racism," she says.

Resource Allocation For The Homeless In Los Angeles

Eubanks says these tools are being used to outsource hard decisions to machines — including the allocation of housing in Southern California.

"So there are 58,000 unhoused folks in Los Angeles," she says. "It's the second highest population in the United States and 75 percent of [those unhoused] are completely unsheltered, which means they're just living in the street."

"I do not want to be the caseworker who is making that decision, who is saying there are 50,000 people with no resources, I have a handful of resources available, now I have to pick," she says.

Still, automation is not the solution here, Eubanks says. To underline the point, she cites public interest lawyer Gary Blasi in her book: "Homelessness is not a systems engineering problem, it's a carpentry problem."

In other words, if you've got 10 houses for 20 people it doesn't matter how good the system for housing those people is — it won't work.

That's not to say automation doesn't have an important role in helping limit failures caused by caseworkers "who are racist, who discriminate, who favor some clients over others for inappropriate reasons," Eubanks says.

"Human bias in public assistance systems has created deep inequalities for decades," she says. "Specifically around the treatment of black and brown folks who have often been either overrepresented in the more punitive systems or diverted from the more helpful systems."

These inequalities can manifest in a number of ways. People of color are more likely to go to prison, have their children taken away from them or not receive public housing.

"But the thing that's really important to understand," the author notes, "these systems don't actually remove that bias, they simply move it."

A Child Welfare Risk Model In Allegheny County, Pa.

One case of this bias displacement is found in Pennsylvania, Eubanks says. In Allegheny County, the Department of Human Services employs a predictive algorithm aimed at projecting which children are likely to become victims of abuse.

"In that case, one of the hidden biases is that it uses proxies instead of actual measures of maltreatment," she says. "And one of the proxies it uses is called call re-referral. And the problem with this is that anonymous reporters and mandated reporters report black and biracial families for abuse and neglect three and a half times more often than they report white families."

Eubanks knows she could have turned out a pretty portrait of three different automated systems elsewhere in the country that were providing services effectively. But she says she wanted to give a voice to the vulnerable people, the families to whom she said these systems looked "really different than they look from the point of view of the data scientists or administrators who were developing them."

"I wasn't hearing these voices at all in the debates that we've been having about what's sort of coming to be known as algorithmic accountability or algorithmic fairness," she says.

Eubanks says policymakers can look to successful models when implementing an automated system. "In Chicago there's a great system called mRelief," she says. "mRelief basically allows you to sort of ping government programs to see if you might be eligible for them. And then the folks who work for mRelief actually help step you through — either in person or through text — the process of getting all the entitlements that you are eligible for and deserve."

Copyright 2018 NPR. To see more, visit http://www.npr.org/.

ARI SHAPIRO, HOST:

In the fall of 2008, an Indiana woman named Omega Young got a letter saying she needed to recertify for the state's public benefits program.

VIRGINIA EUBANKS: But she was unable to make the appointment because she was suffering from ovarian cancer.

SHAPIRO: She called the local office to say she wouldn't make the appointment because she was hospitalized getting cancer treatments and she lost her benefits anyway. The reason - failure to cooperate.

EUBANKS: So because she lost her benefits, she couldn't afford her medications, she lost her food stamps, she couldn't pay her rent. She lost access to free transportation to her medical appointments. And Omega Young died on March 1, 2009. And on the next day, she won an appeal for wrongful termination and all of her benefits were restored the day after her death.

SHAPIRO: This is one of the stories the author Virginia Eubanks tells in her latest book "Automating Inequality: How High-Tech Tools Profile, Police, And Punish The Poor." That book is the subject of this week's All Tech Considered.

(SOUNDBITE OF MUSIC)

SHAPIRO: Virginia Eubanks argues that many of the automated systems that deliver public services today are rigged against the people these programs are supposed to serve. She dives deep into three examples of automated public services - welfare benefits in Indiana, housing for the homeless in Los Angeles and children's services in Allegheny County, Pa., which includes Pittsburgh.

The Indiana case was so bad that the state eventually gave up on the automated system. Virginia Eubanks started by telling me what state lawmakers were trying to accomplish through automation.

EUBANKS: Indiana was attempting to save money and to make the system more efficient. But the way the system rolled out, it seems like one of the intentions was actually to break the relationship between caseworkers and the families they served. The governor sort of did a press tour around this contract. And one of the things he kept bringing up was there was one case where two case workers had colluded with some recipients to defraud the government for about - I think it was about $8,000.

And the governor used this case over and over and over again to suggest that when caseworkers and families have personal relationships, that it's an invitation to fraud. So the system was actually designed to break that relationship. So what happened is the state replaced about 1,500 local caseworkers with online forms and regional call centers.

And that resulted in a million benefits denials in the first three years of the experiment, which was a 54 percent increase from the three years before.

SHAPIRO: Is an automated system of public services inherently going to be less helpful, less effective than something like Uber or Lyft or Amazon or all the automated things that people who are not in poverty rely on every day?

EUBANKS: No. There's nothing intrinsic in automation that makes it bad for the poor. One of my greatest fears in this work is that we're actually using these systems to avoid some of the most pressing moral and political challenges of our time, specifically poverty and racism. So we're kind of using these systems as a kind of empathy override. You know, let's talk about Los Angeles.

So there's 58,000 unhoused folks in Los Angeles. It's the second-highest population in the United States and 75 percent of them are completely unsheltered, which means they're just living in the street. I do not want to be the case worker who is making that decision, who is saying there's 50,000 people with no resources. I have, you know, a handful of resources available. Now I have to pick.

But the problem is that we are using these tools to basically outsource that incredibly hard decision to machines.

SHAPIRO: So the underlying problem is not that the housing system is automated but it sure doesn't help that automating that system allows people to ignore, more or less, the fact that there are not enough houses.

EUBANKS: Yeah. So one of the folks I talked to in the book, this great, brilliant man Gary Blasi has one of the best quotes in the book and he says, homelessness is not a systems engineering problem. It's a carpentry problem, right?

SHAPIRO: If you've got 10 houses for 20 people, it doesn't matter how good the system for housing those people is, it's not going to work.

EUBANKS: Exactly.

SHAPIRO: As you point out in the book, caseworkers have biases. There are case workers who are racist, who discriminate, who favor some clients over others for inappropriate reasons. Doesn't automation have the potential to solve those problems?

EUBANKS: Yeah, let's be absolutely direct about this that human bias in public assistance systems have created deep inequalities for decades. And it's specifically around the treatment of black and brown folks, who have often been either overrepresented in the more punitive systems or diverted from the more helpful systems because of frontline caseworker bias.

SHAPIRO: So they get thrown in prison more often or their children taken away more often, they get public housing less often, that sort of thing.

EUBANKS: Exactly. But the thing that's really important to understand about the systems I profile in "Automating Inequality" is that these systems don't actually remove that bias, they simply move it. So in Allegheny County where I look at the predictive model that's supposed to be able to forecast which children will be victims of abuse or neglect in the future, in that case, one of the hidden biases is that it uses proxies instead of actual measures of maltreatment.

And one of the proxies it uses is called call re-referral, which just means that a child is called on and then a second call comes in within two years. And the problem with this is that both anonymous reporters and mandated reporters report black and biracial families for abuse and neglect 3.5 times more often than they report white families.

SHAPIRO: You draw these three detailed pictures of automated systems falling short in Indiana, California, Pennsylvania. Do you think a different author could have found three different automated systems somewhere in the country that were working really well in providing services effectively?

EUBANKS: Absolutely. One of the things that's different about the way that I wrote the book is that I started from the point of view of the targets of these systems. It doesn't mean I only spoke to those folks. But I spoke to, you know, unhoused folks, both those who have had luck getting housing through coordinated entry and those who haven't. I spoke to families who have been investigated for maltreatment.

And I will say that when you start from the point of view of these very vulnerable families, that these systems look really different than they look from the point of view of the data scientists or administrators who are developing them. And I wasn't hearing these voices at all in the debates that we've been having about what's sort of coming to be known as algorithmic accountability or algorithmic fairness.

I was never hearing the voices of the people who face the pointy end of the most punitive stick. And I really thought it was important to bring those stories to the table.

SHAPIRO: Virginia Eubanks, thanks so much for talking with us.

EUBANKS: Thank you so much.

SHAPIRO: Her book is called "Automating Inequality: How High-Tech Tools Profile, Police, And Punish The Poor." Transcript provided by NPR, Copyright NPR.