Algorithms can influence our lives in ways large and small: where our kids go to school, whether police patrol our neighborhoods and even how long we wait at red lights.
A new report released today by the Pittsburgh Task Force on Public Algorithms, an initiative of the University of Pittsburgh’s Institute for Cyber Law, Policy, and Security (Pitt Cyber), details the potential pitfalls of algorithmic systems and addresses how policymakers can ensure those systems remain accountable to the public.
“Agencies across the country, including in our region, are using algorithmic systems to help make government decisions. We need to make sure they are doing so with transparency, accountability and the understanding of the public,” said David Hickton, founding director of Pitt Cyber.
The report is the result of a two-year effort by task force members drawn from local and national experts and community leaders, including a government advisory panel with designees from Allegheny County and the City of Pittsburgh.
Put simply, algorithms are computational tools that use data to predict outcomes. Proponents of public algorithmic systems cite significant benefits, including more efficient data processing, fewer decision-making errors relative to humans and the ability to weigh far larger sets of factors and data at far greater speed.
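To make that concrete, here is a minimal, purely illustrative sketch of such a tool: a model fit to historical records that assigns each new case a predicted outcome. The feature names, data and model choice below are invented assumptions for this sketch, not drawn from any system the report examines.

```python
# Hypothetical illustration of a predictive "risk score" of the kind
# the report describes: a model is fit to historical records, then
# applied to new cases. All feature names and data here are invented.

from sklearn.linear_model import LogisticRegression

# Historical records: [prior_arrests, age] -> reoffended within 2 years (1/0)
X_train = [[0, 45], [1, 38], [5, 22], [3, 25], [0, 60], [7, 19]]
y_train = [0, 0, 1, 1, 0, 1]

model = LogisticRegression().fit(X_train, y_train)

# A new case: the tool turns a person's record into a predicted
# probability, which a decision-maker may read as a "risk level."
new_case = [[2, 30]]
print(model.predict_proba(new_case)[0][1])  # probability of the "1" outcome
```

The appeal and the danger live in the same place: whatever patterns exist in the historical records, including biased ones, are what the model learns to reproduce.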
While the benefits of algorithms are notable, the task force’s report details how algorithms can reflect existing biases in data and our society — and how they can accelerate and exacerbate harm, especially along racial and gender lines.
“We cannot miss this opportunity to do better for our region and make sure that we do not lock in a digital Jim Crow on this front in the fight for civil rights,” said Hickton.
One example of a government algorithm gone awry is the Broward County, Florida, court system’s reliance on a tool called COMPAS to perform risk assessments of people facing criminal sentencing. The algorithm analyzed a defendant’s history and demographics and rated how likely that person was to offend again. It was found to falsely flag Black defendants as future criminals at almost twice the rate of white defendants.
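A disparity like that is typically surfaced by comparing false positive rates across groups: the share of people who did not reoffend but were nonetheless flagged as high risk. The sketch below shows that calculation on invented numbers; it does not reproduce the COMPAS data or the original analysis.

```python
# Hypothetical audit sketch: compare false positive rates by group.
# A false positive here is someone flagged "high risk" who did not
# reoffend. The records below are invented, not the COMPAS data.

records = [
    # (group, flagged_high_risk, reoffended)
    ("A", True,  False),
    ("A", True,  False),
    ("A", False, False),
    ("A", True,  True),
    ("B", True,  False),
    ("B", False, False),
    ("B", False, False),
    ("B", False, True),
]

def false_positive_rate(rows):
    """Share of non-reoffenders who were nonetheless flagged high risk."""
    negatives = [r for r in rows if not r[2]]   # did not reoffend
    flagged = [r for r in negatives if r[1]]    # but were flagged anyway
    return len(flagged) / len(negatives) if negatives else 0.0

for group in ("A", "B"):
    rows = [r for r in records if r[0] == group]
    print(group, false_positive_rate(rows))    # A: 0.67, B: 0.33
```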
“We do not have to accept the false choice between technological advancement and civil and constitutional rights. People of goodwill can find ways to balance liberty and security in the digital age, leveraging tech innovation fairly and with transparency,” said Hickton.
Against this complex backdrop, the task force endeavored to learn from local governments’ experiences with algorithmic systems. It found a range of approaches with profoundly different commitments to being transparent, engaging the public and obtaining outside reviews of the systems.
“The work of the Pittsburgh Task Force on Public Algorithms is grounded in a simple truth: algorithmic tools are set up to fail if the public doesn't know what they are or trust how they are being used,” said LaTrenda Sherrill, task force member and founder of Common Cause Consultants, who led the task force’s community engagement efforts.
Based on research, lessons learned from case studies and public feedback, the task force outlined recommendations for local and regional governments to manage algorithmic systems and ensure their accountable use across agencies. The recommendations include:
- Encouraging meaningful public participation commensurate with the risk level of the potential algorithmic system.
- Involving the public in algorithmic system development plans from the earliest stages through any later substantive changes to the system.
- Utilizing external reviews when the system might be at higher risk of error.
- Assessing whether any planned procurement might include an algorithmic system.
- Publishing information about algorithmic systems on a public registry website (a sketch of what one such entry might contain follows this list).
- Avoiding facial recognition and related systems.
- Evaluating the effectiveness of the recommendations.
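As an illustration of the registry recommendation, here is one hypothetical shape such an entry might take. The field names and values are assumptions of ours, not a schema the task force prescribes.

```python
# Hypothetical example of what one public-registry entry might record.
# Every field name and value here is an illustrative assumption,
# not a schema or a system drawn from the report.
registry_entry = {
    "system_name": "Example benefits-screening tool",        # invented
    "agency": "Example County Department of Human Services",  # invented
    "purpose": "Prioritize applications for caseworker review",
    "vendor": "In-house",
    "inputs": ["application data", "prior case history"],
    "risk_level": "higher",   # could drive how much public review applies
    "public_participation": "Community meetings held before deployment",
    "external_review": "Independent audit scheduled annually",
    "last_updated": "2022-03-01",
}
```

A registry along these lines would give residents a single place to learn what systems their government uses, for what purpose and with what oversight.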
The University of Pittsburgh Institute for Cyber Law, Policy, and Security and the Pittsburgh Task Force on Public Algorithms gratefully acknowledge the generous support of The Heinz Endowments and the Hillman Foundation.
— Nichole Faina