Some of the grants had yet to be disbursed as of last week, but a news release on the foundation’s website breaks down the five recipients as follows:
- $300,000 over two years to the Roger Baldwin Foundation, a division of the American Civil Liberties Union in Illinois, to see that contact tracing technology and data in the state are equitable and protective of civil rights and liberties
- $300,000 over two years to the Crossroads Fund, a Chicago-area nonprofit, for launching a network of technologists and local organizers in Chicago to gather input from various communities on how contact tracing tools should be used and deployed, with sensitivity toward racial justice and unequal access to these tools
- $250,000 over 18 months to the Berkman Klein Center for Internet and Society at Harvard University to research what kinds of policies and practices around contact tracing have successfully promoted equity, inclusion and replicability, particularly in the state of Massachusetts
- $500,000 over one year to the Social Science Research Council, a U.S.-based international nonprofit, to support the council’s Just Tech program and the international Public Health, Surveillance, and Human Rights Network, which will research how contact tracing can be done equitably and without violating people’s rights
- $250,000 over one year to The Citizen Lab at the University of Toronto to expand its research tracking various tracing app laws and tools around the world, doing forensic analysis and creating reports about risks and impacts on historically marginalized communities
“At the most basic level, the apps are dependent and predicated upon the use of smartphone technology,” he said. “While smartphone technology is becoming more widespread, there’s still unequal distribution of it, and if you look at where pickup is lowest, it includes historically marginalized communities in the U.S., and we know that those populations … are being disproportionately impacted by the COVID crisis.”
Sears said the foundation has been listening closely to historically marginalized groups about what he called “the twin pandemics of COVID-19 and anti-black racism.” He said their concerns — compounded with universal worries about how data is used, who can access it, and where and how long it’s stored — prompted the $1.6 million investment.
“We were listening to questions that we felt policymakers were struggling with in relation to the use of technology in responding to COVID-19,” he said. “And also, as a global foundation, we’re interested in looking for opportunities to create strong connections between what’s happening in different parts of the world and how it might impact us here in the United States, and even locally in Chicago.”
Equity
At Harvard University’s Berkman Klein Center, Executive Director Urs Gasser said one persistent problem around the world with these technologies has been fragmentation of expertise: people who know a lot about tech don’t necessarily know about public health, and vice versa. There are also civil liberties issues, but experts in that field might not be in communication with technologists or public health experts.

To address this problem, the Berkman Klein Center is creating a working group and funneling their expertise into a program called BKC Policy Practice: Digital Pandemic Response. The primary audience for this will be local governments.
“Think of it like a walk-in legal clinic. You can bring your question or problem, and we can group around that problem, bring experts together from these different disciplines, and work toward solutions that are in the public interest to present to different stakeholders,” Gasser said. “One of the early takeaways across some of these conversations is the importance of trust, and how hard it is, especially in a moment of crisis like the current one, to create trust. Unfortunately there isn’t too much trust already, given the circumstances. That trust formation … needs many stakeholders.”
Gasser pointed to Switzerland as an example of a government trusted by its citizens that appeared to have done everything right — rolling out a proximity-tracing app with Google’s secure framework, conducting surveys beforehand, discovering that people were more likely to trust an app if it was branded by the federal health agency instead of a university or tech company — and they still didn’t quite embrace it.
“One of the reasons, when surveying people after the release, is, there’s still some doubts about what happens to the data, what is the role of Google and Apple and all of that. Second … people weren’t ready to make decisions, like, do I automatically stay home for two weeks if I don’t have any symptoms?” he said. “So it’s fascinating that even under the best circumstances, there are these trust concerns. Now take it to an environment like here in the U.S., in the current conditions, it’s unbelievably complex and has a long history.”
Gasser conceded that the big challenge is how to make these apps available and useful in the low-income communities that need them most; that is, how to ensure equity for a pandemic-fighting tool that depends on an inequitably distributed resource like smartphone technology. He didn’t have an answer, but he stressed the importance of bringing disadvantaged communities to the table for discussions. Members of the Berkman Klein Center also wrote an opinion piece in The New York Times in July, calling on governors to team up and devise testing infrastructure that’s more accessible to underserved communities.
Gasser said that when it comes to convincing people to download apps, it’s not only about the message but the messenger — who is trusted by the community that needs to receive the message.
“These are structural inequalities that have a long history, they’re systemic, and you can’t just take a solutionist approach to it, whether it’s an app you want to develop or anything else. It’s incredibly complex,” he said. “In terms of the overall approach, I think what we observe globally is, it’s really working with communities and including communities in the development process of whatever the tool is that you’re developing, whether it’s an app, a workflow, an awareness campaign. Make sure you have representative and inclusive design processes.”
Security
At The Citizen Lab at the University of Toronto, which does interdisciplinary research on technology and security issues that impact human rights, the new $250,000 grant will have a more international scope. The lab’s founder and director, Ron Deibert, said his efforts encompass four focus areas, the first of which will examine censorship on Chinese social media platforms such as WeChat and YY. The second will consider emergency measures and policies related to COVID-19 in Southeast Asia, evaluating whether they empowered government or military agencies in ways that put civil society at risk. The lab has already been doing that work, but the grant may expand it into other regions, including the Middle East, North Africa and Latin America.

“Many of these countries are already democratically challenged in various ways,” he said. “The supposition here is that the pandemic is being used as a way to bolster those forms of illegitimate rule, so we want to track that very carefully.”
Deibert said the lab’s third focus area is disinformation, for which they’ve already done some work tracking government-sponsored campaigns in Russia and Iran.
“Obviously there’s a lot of disinformation going on around COVID-19. A lot of people are tracking this already,” he said. “We haven’t started looking into that in a systematic sense, but once we get the grant, I’ll probably advertise for a research fellow or postdoctoral fellow with expertise in that area to track a course and see what we can uncover.”
Last but not least, The Citizen Lab will look at companies in and outside the U.S. that are stepping forward with contact tracing apps and location tracking services. Deibert said the private industry of location tracking tools is, generally speaking, neither well regulated nor secure. He said many companies don’t put as much effort into securing their own data gathering practices as they should, so data breaches are common.
To evaluate these apps, Deibert said, The Citizen Lab downloads them, reverse-engineers them, examines their network traffic and generally tries to get a sense of the architecture and design principles that went into them. For example, he said the lab looked at a contact tracing app from the United Arab Emirates that Amnesty International was interested in, and found it was more or less a form of spyware.
But besides suspicious software from the UAE or the Philippines, what of Western democracies? Deibert said some of the lab’s concerns for the West have to do with unintended consequences — for instance, what might happen if governments normalize the idea of constantly monitoring people’s health and sending that data to companies, law enforcement or other third parties.
“We already know that the use of algorithms, for example by police and law enforcement, can really exacerbate existing types of discrimination around racialized policing. COVID is hitting marginalized communities already worse than others … so you can foresee things like immunity passports, or technology that’s used to monitor workers as they go into places of work, that could be coincidentally used to further forms of discrimination or labor practices that are unfair,” he said. “With all of this data being collected, to what degree will it be used for national security purposes that don’t have narrowly to do with the pandemic?”
As with the other MacArthur grant recipients, The Citizen Lab’s contribution to safe and equitable contact tracing will mostly be research. Deibert said the lab publishes independently on its own website, and also in peer-reviewed journals and other academic platforms. He doesn’t see a sunset for this research any time soon.
“Many of these applications, once installed, won’t likely go away,” he said. “There are new practices forming around policing and emergency measures and surveillance technologies that will be with us after the pandemic subsides. So we’ll be tracking this for a while.”