Connecticut Committee Issues Report on Implications of Algorithms

The Connecticut Special Advisory Committee to the U.S. Commission on Civil Rights has issued a report outlining the implications of the use of algorithms and the potential for discrimination.

(TNS) — The Connecticut Special Advisory Committee to the U.S. Commission on Civil Rights on Tuesday released a report, “The Civil Rights Implications of Algorithms,” which found that government algorithms can be discriminatory and biased against marginalized communities, and that their use can negatively affect people’s day-to-day lives.

Advisory Committee Chair David McGuire said his committee voted unanimously to take on the project and, in examining the civil rights implications of the government’s use of algorithms in the state, held four public hearings in 2022 to learn more about algorithms and to develop models for regulating them so they do not have negative consequences for protected classes and civil rights.

McGuire said key themes became clear as the committee studied the issue: algorithms are powerful tools that are going to be a mainstay in the United States moving forward, and they can amplify bias and discrimination against protected classes of people and others.

“Algorithms can create or perpetuate discrimination through reliance on data sets that are historically biased, consideration of proxy variables for race, differential accuracy rates between groups, and more,” the report states. “Sometimes this bias is intentional, but more often it is a result of unintentional bias on the part of programmers, historical biases in the data, or the unintentional consequence of giving the program specific goals that fail to account for disparate impacts.”
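The proxy-variable mechanism the report describes is straightforward to illustrate. The following Python sketch is hypothetical and not drawn from the report: it builds synthetic applicant data in which historical approvals were biased against one group, then “trains” the simplest possible model on zip code alone. Race is never an input, yet the model reproduces the disparity because zip code stands in for group membership. All names and numbers are invented for illustration.

```python
# Hypothetical sketch (not from the report): a proxy variable can
# reproduce historical bias even when race is excluded from the model.
import random
from collections import defaultdict

random.seed(0)

def make_applicant():
    # Assumed setup: group membership correlates with zip code, and
    # historical approvals were biased against group B.
    group = random.choice(["A", "B"])
    if group == "A":
        zip_code = "06001" if random.random() < 0.9 else "06120"
    else:
        zip_code = "06120" if random.random() < 0.9 else "06001"
    qualified = random.random() < 0.5
    # Biased historical label: qualified group-B applicants were
    # approved far less often than qualified group-A applicants.
    approved = qualified and (random.random() < (0.9 if group == "A" else 0.4))
    return group, zip_code, qualified, approved

data = [make_applicant() for _ in range(10_000)]

# "Train" the simplest possible model: score each applicant with the
# historical approval rate of their zip code. Race is never an input.
counts = defaultdict(lambda: [0, 0])  # zip -> [approvals, total]
for group, zip_code, qualified, approved in data:
    counts[zip_code][0] += approved
    counts[zip_code][1] += 1
rate_by_zip = {z: approvals / total for z, (approvals, total) in counts.items()}

# The learned rule mirrors the historical bias: group-B applicants,
# concentrated in one zip code, receive systematically lower scores.
for g in ("A", "B"):
    scores = [rate_by_zip[z] for grp, z, _, _ in data if grp == g]
    print(f"group {g}: mean predicted approval rate {sum(scores) / len(scores):.2f}")
```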

McGuire said the committee confirmed through its public hearings that the state is using algorithms for some automated decision-making processes, and that some of those uses are consequential: in education, for example, determining who wins the school lottery, and in Department of Administrative Services matters such as choosing job applicants.

McGuire said these are areas where civil rights are really being implicated, because of the potential use of algorithms that either rely on flawed datasets or are set up in ways that are biased. Some areas where the committee determined algorithms could be used in a problematic way are: child welfare agencies, the criminal legal system, predictive policing and determining crime rates.

Another key theme McGuire said the committee found is that transparency can be lacking around the government’s use of algorithms in the state.

“I think that we’re at a place where people are excited to latch on to this technology and use it to deliver services to the citizens of the state, but are doing so in ways that are not as well-formulated as they should be,” he said.

He said the Freedom of Information system in the state is not well-suited for the public or the media to get information about the government’s use of algorithms, due to trade secret exemptions.

McGuire said trade secret exemptions to the FOI law can be used as a shield to prevent people from learning about the government use of algorithms.

In response to its findings, the committee developed recommendations for the Legislature aimed at making sure algorithms do not negatively affect civil rights in the state.

The recommendations include:

  • The state should adopt laws and regulations with discrimination in mind: Before algorithms are purchased, leased or developed, the state should find ways to meaningfully assess whether the datasets it plans to use contain built-in bias.
  • Once an algorithm is in use, the state should commission regular independent audits of that system to make sure it is being used as intended and that its outcomes are fair and unbiased (a minimal sketch of one such outcome check appears after this list).
  • Create an opt-out option and appeal process that includes human decision-makers for people who believe “they have been negatively impacted by individualized automated decision-making algorithms” used by government, or a public explanation of why such an opt-out option cannot be offered.
  • Meaningful public education by the state, through campaigns explaining what algorithms are, why they are in use and why they are fair, or through a public dashboard showing which state agencies are using algorithms and why.
  • FOI laws in Connecticut should be revamped to account for algorithms, with explicit requirements that the state disclose information about them.
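The report does not prescribe a particular audit methodology. One common statistic an independent auditor might compute is the adverse impact ratio: a group’s selection rate divided by the highest group’s rate, where a ratio below 0.8 (the EEOC’s “four-fifths rule” of thumb) is a conventional flag for possible disparate impact. The sketch below is a hypothetical illustration with made-up numbers, not a method taken from the committee’s report.

```python
# Hypothetical audit sketch: flag groups whose selection rate falls
# below four-fifths of the highest group's rate. Data is illustrative.

def selection_rates(decisions):
    """decisions: list of (group, selected) pairs -> selection rate per group."""
    totals, selected = {}, {}
    for group, was_selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratios(decisions):
    """Each group's selection rate relative to the best-treated group."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Made-up example: 100 decisions per group from an automated screener,
# where group A is selected 50% of the time and group B only 30%.
decisions = [("A", i < 50) for i in range(100)] + [("B", i < 30) for i in range(100)]
for group, ratio in adverse_impact_ratios(decisions).items():
    flag = "FLAG: below four-fifths threshold" if ratio < 0.8 else "ok"
    print(f"group {group}: ratio {ratio:.2f} ({flag})")
```

In this invented example the audit would flag group B (ratio 0.60); a real audit would of course also examine the data pipeline and error rates, not just headline selection rates.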

Suresh Venkatasubramanian, a professor of computer science and data science at Brown University who has assisted the committee, said algorithms are being used in every domain that touches our lives.

Algorithms affect health and the ability to get an education, find a job, start a business or find affordable places to live, he said. “They impact our families and our liberty. Their promise is great, but so are the harms. And we’ve seen these harms play out again and again in a pattern that is both familiar and troubling.”

“Biased data feeds into algorithms that exploit and seek out these biases, so as to more accurately mimic historical patterns of discrimination. These algorithms are opaque, inscrutable, and hide their biased behavior amidst piles of mathematics and computation. And the harms fall disproportionately on those who are most vulnerable, most disadvantaged and most likely to already be struggling to flourish,” he said.

Vincent Southerland, assistant professor of clinical law at New York University School of Law and co-faculty director of its Center on Race, Inequality, and the Law, said algorithms can also carry tremendous hope, with the capacity to help government officials and other critical decision-makers make accurate and efficient decisions that can make a difference in people’s lives.

He also noted, however, that the report makes clear caution must be exercised when using these tools.

“These tools rely on data about our past, inform decisions about our present, and shape our future. If you know anything about our past, and by extension our present, it is that in far too many ways it is steeped in racial inequality. It’s because of that that these tools can replicate an unjust past and propel that inequality into the future,” he said.

“That is made worse by the ubiquitous nature of these tools, the fact that they are used in places and in making decisions that we don’t even know about, and that far too often they operate in secret. And they’re also created by people who bring their own biases and values to the way the tools are made and deployed,” he said.

State Rep. David Rutigliano said House Republicans support transparency in how the government uses AI and in how it impacts citizens.

“We also fully support the state of Connecticut protecting citizens’ data under the same restrictions that would apply to a private company. We are proud of the landmark data privacy law that we passed last session with the leadership of Senator (James) Maroney. And we think the state should be subject to many of the same provisions in that law that private companies are,” he said.

State Sen. James Maroney said that over the past year, they have run a task force, created under the state data privacy law, to look specifically at AI and how it can be regulated.

“A lot of this is about transparency. What we want to do going forward is ensure that anywhere we’re using algorithms, they are listed on that open data portal so that we’ll know where they’re being used. When we started this work, looking at algorithms and AI, it was really before ChatGPT had come out. And so now, I think AI is on everyone’s radar and everyone’s thinking about that. It’s hard to define what AI is. It could be as simple as a chatbot or it could be something that makes a decision. What we want to look at is these algorithms that are making critical decisions,” he said.

©2023 Hartford Courant, Distributed by Tribune Content Agency, LLC.