Flaws plague a tool meant to help low-risk federal prisoners win early release

A prisoner looks out of his jail window as protesters gather outside the federal detention center in Miami on June 12, 2020, during a demonstration over the death of George Floyd.
Chandan Khanna
AFP via Getty Images

Thousands of people are leaving federal prison this month thanks to a law called the First Step Act, which allowed them to win early release by participating in programs aimed at easing their return to society.

But thousands of others may still remain behind bars because of fundamental flaws in the Justice Department's method for deciding who can take the early-release track. The biggest flaw: persistent racial disparities that put Black and brown people at a disadvantage.

In a report issued days before Christmas in 2021, the department said its algorithmic tool for assessing the risk that a person in prison would return to crime produced uneven results. The algorithm, known as Pattern, overpredicted the risk that many Black, Hispanic and Asian people would commit new crimes or violate rules after leaving prison. At the same time, it also underpredicted the risk for some inmates of color when it came to possible return to violent crime.

"From the beginning, civil rights groups cautioned Congress and the Justice Department that use of a risk assessment tool to make these determinations would lead to racial disparities," said Aamra Ahmad, senior policy counsel at the American Civil Liberties Union.

"The Justice Department found that only 7% of Black people in the sample were classified as minimum level risk compared to 21% of white people," she added. "This indicator alone should give the Department of Justice great pause in moving forward."

The rule of unintended consequences

An American flag flies outside the Department of Justice in Washington in March 2019.
Andrew Harnik / AP

Risk assessment tools are common in many states. But critics said Pattern marks the first time the federal justice system has used an algorithm with stakes this high.

Congress passed the First Step Act in 2018 with huge bipartisan majorities. It's designed to prepare people in prison for life afterward by offering credits toward early release for working or taking life skills and other classes while behind bars.

Lawmakers like Sens. Sheldon Whitehouse of Rhode Island and John Cornyn of Texas took inspiration from similar criminal justice reforms in states, which they said led to drops in both prison populations and crime. The senators pointed out that some 9 in 10 people in prison eventually return home, and they contended that preparing them for release made good sense for formerly incarcerated people and for public safety.

Only inmates who pose a low or minimal risk of returning to crime can qualify for the programs, with that risk level determined using the Pattern algorithm.

"The significance of this risk assessment tool is that it divides all federal prisoners essentially into two groups: people who can get credit for doing this programming and get out early, and people who can't," said Jim Felman, an attorney in Tampa, Fla., who has been following the First Step Act for years.

The implementation has been rocky. The Justice Department finished the first version of Pattern in a rush because of a tight deadline from Congress.

It then had to make tweaks after finding Pattern suffered from math and human errors.

About 14,000 men and women in federal prison still wound up in the wrong risk categories. There were big disparities for people of color.

"The legislation, I think, came from a good place," said Melissa Hamilton, a professor of law and criminal justice at the University of Surrey who studies risk assessments. "It's just the rule of unintended consequences is not really realizing the impediments it was going to have."

Risk assessment tool "sounds highly technical, but it's not"

"You use a term like 'risk assessment tool,' it has this patina of science, it sounds highly technical, but it's not," said Patricia Richman, who works on national policy issues for the Federal Public and Community Defenders. "A risk assessment tool is just a series of policy decisions."

Those policy decisions are made by determining what counts as a risk factor and by how much.

Criminal history can be a problem, for example, because law enforcement has a history of overpolicing some communities of color. Other factors such as education level and whether someone paid restitution to their victims can intersect with race and ethnicity, too.
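To make concrete what "a series of policy decisions" means in practice, here is a minimal sketch of how a points-based risk tool generally works. Everything in it is hypothetical: the factor names, weights, and cutoffs are invented for illustration and do not reflect Pattern's actual model.

```python
# Hypothetical factors and weights -- a policy choice, not Pattern's real ones.
WEIGHTS = {
    "prior_convictions": 4,        # criminal history can reflect overpolicing
    "infractions_in_prison": 3,
    "no_high_school_diploma": 2,   # education level intersects with race/ethnicity
    "restitution_unpaid": 1,
}

def risk_score(person):
    """Sum the weights of every factor that applies to this person."""
    return sum(w for factor, w in WEIGHTS.items() if person.get(factor))

def risk_category(score):
    """Map a raw score to a category using policy-chosen cutoffs."""
    if score <= 2:
        return "minimum"
    if score <= 5:
        return "low"
    if score <= 8:
        return "medium"
    return "high"
```

The point of the sketch is that each number above, which factors count, how much each one weighs, and where the category lines fall, is a judgment call made by people, not something discovered by science.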

In its December report, the Justice Department concluded that some of the disparities could be reduced, "but not without tradeoffs" such as less accurate risk predictions. The department also said using race as a factor in the algorithm could trigger other legal concerns.

Still, it is consulting with experts about making the algorithm fairer and another overhaul of Pattern is already underway.

Attorney General Merrick Garland has directed the department to look for ways to assess racial bias and make the tool more transparent, a spokeswoman said.

One option is to adjust the cutoff points between the risk categories, allowing more prisoners to earn credits for release, which would "maximize access to First Step Act relief while ensuring public safety," she said.

Ultimately, Garland will have to sign off on a new version. Then, Justice has to reevaluate the 14,000 people in prison who got lumped into the wrong category.

"This is just one example of the ways that harmful artificial intelligence systems are being rolled out in everything from the criminal legal system to employment decisions to who gets access to housing and social benefits," said Sasha Costanza-Chock, director of research and design for the Algorithmic Justice League, which studies the social implications of artificial intelligence.

Costanza-Chock said the burden is on the Justice Department to prove the Pattern tool doesn't have racist and sexist outcomes.

"Especially when systems are high risk and affect people's liberty, we need much clearer and stronger oversight," said Costanza-Chock.

The Metropolitan Detention Center prison in Los Angeles is seen on July 14, 2019.
David McNew / Getty Images

Looking for resolution

Felman, the Florida lawyer working with the American Bar Association, worried that the tool will continue to put many prisoners of color at a disadvantage.

"We will start to see more prisoners get out early," he said. "My concern is that the color of their skin will not be reflective of fairness."

The ACLU's Ahmad said she's seen enough.

"There are no technical fixes to these problems that could make Pattern and similar tools safe and fair to use," Ahmad said. "We would urge the Justice Department to suspend the use of Pattern until it can adequately address these concerns."

Hamilton, who studies risk assessments, thinks the Pattern tool may be worth saving. Consider the alternative, she said: decisions made by people who have all kinds of biases.

"So that's the unfortunate thing is, it's better than gut instinct of the very flawed humans that we all are, and can we improve it more than marginally, and that's what we're all working on?" Hamilton said.

Copyright 2023 NPR.

Carrie Johnson is a justice correspondent for the Washington Desk.