Contact Tracing Apps Won't End The Pandemic

Do they work? Not really. Are they safe? Kind of. Should we make them? Maybe.

I recently read an article in which the director of the Alexandria, Va., public health department said their contact tracers have gotten off phone calls with newly diagnosed COVID-19 patients who had seen more than 60 people in the previous two weeks. And those are just the people the patients could name. There must be at least a handful of strangers who were also exposed but will never be told to get tested.

In May, Apple and Google released an API that allows app makers to use Bluetooth as a means for contact tracing. It’s a simple idea: your phone has a unique ID, and when you’re within a certain range of someone else with the app, your phones exchange IDs. When someone you've exchanged IDs with self-reports they've tested positive, you get a notification. It’s anonymous, it’s voluntary, it’s instant, it doesn’t use GPS or location data, and it doesn't need to involve the government. Best of all, it solves the problem of not being able to contact strangers.
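
To make the mechanics concrete, here is a minimal sketch of that exchange-and-match idea in Python. It is an illustration only, not the actual Apple/Google Exposure Notification protocol, which derives rotating identifiers from daily keys and performs the matching on-device through the operating system.

```python
# Illustration only: a toy version of the core idea described above.
# Broadcast anonymous IDs, remember the IDs you hear, and check them against
# the IDs published by people who report a positive test.
import secrets


class Phone:
    def __init__(self):
        self.my_ids = [secrets.token_hex(16)]  # anonymous IDs this phone broadcasts
        self.heard_ids = set()                 # IDs overheard from nearby phones

    def rotate_id(self):
        """Broadcast a fresh random ID so the phone can't be tracked over time."""
        self.my_ids.append(secrets.token_hex(16))

    def encounter(self, other):
        """Two phones within Bluetooth range exchange their current IDs."""
        self.heard_ids.add(other.my_ids[-1])
        other.heard_ids.add(self.my_ids[-1])

    def check_exposure(self, published_positive_ids):
        """Matching happens locally; only the IDs of users who report a
        positive test are ever published."""
        return bool(self.heard_ids & published_positive_ids)


alice, bob = Phone(), Phone()
alice.encounter(bob)                     # they pass within range of each other
positive_ids = set(alice.my_ids)         # Alice later reports a positive test
print(bob.check_exposure(positive_ids))  # True: Bob would get a notification
```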

In theory, contact tracing apps could seriously aid contact tracing workers by tracing and notifying exposed people automatically, or at least by sparing tracers the long interview with each sick person. That's a big deal considering the Johns Hopkins Center for Health Security released a statement in April saying the virus was spreading so quickly that the United States would need to hire at least 100,000 new contact tracers to help contain it.

As with most things that sound too good to be true, contact tracing apps come with a lot of caveats and should be approached with caution. I've gone back and forth in my head trying to decide whether it is worthwhile to devote resources to making contact tracing apps right now, or whether we should focus more on better teleworking and remote learning technology and worry about building a solid contact tracing infrastructure when this is all over.

The truth is I still don't know. For every argument against contact tracing apps, there is a decent counter-argument in favor of them. Here is what that looks like:

THE TECHNOLOGY ARGUMENT

It's still fuzzy whether Bluetooth is an effective method of determining contact...

Successful contact tracing via Bluetooth depends on a few things:

  • A high percentage of people use the app
  • Users who test positive promptly self-report or are contacted
  • Bluetooth sensitivity can be limited to a short radius (six feet)
  • Momentary contact (walking past someone) can be filtered out

Now, let's say the app meets all of these requirements and 1 in 3 people in your area have it. If the virus is spreading fairly quickly where you live, you could get a lot of unnecessary pings (and waste testing resources), since we know that not all exposures carry equal risk.

Image source: xkcd

To reduce unnecessary notifications, we would want the app to weight each exposure by its risk (a toy sketch of what that scoring might look like follows this list). That would be pretty difficult, since the app would need to know:

  • Whether you were outdoors or indoors (and even the level of airflow indoors, since that significantly changes your risk level)
  • If either of you were wearing masks
  • If the person infected was talking, coughing, or sneezing
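
Here is that toy scoring sketch in Python. The factors mirror the list above, but the weights are invented purely for illustration; the deeper problem is that a Bluetooth-only app can't observe most of these inputs in the first place.

```python
# Toy illustration only: the weights below are invented, and a Bluetooth-only
# app could not actually observe most of these inputs.
def exposure_risk(minutes, indoors, poor_airflow, both_masked, coughing_or_talking):
    """Return a rough relative risk score for a single exposure."""
    score = minutes / 15          # baseline: 15 minutes of close contact = 1.0
    if indoors:
        score *= 3
        if poor_airflow:
            score *= 2
    if both_masked:
        score *= 0.3
    if coughing_or_talking:
        score *= 2
    return score


# A brief, masked, outdoor pass-by vs. a long, unmasked, indoor conversation:
print(exposure_risk(2, indoors=False, poor_airflow=False, both_masked=True,
                    coughing_or_talking=False))   # ~0.04, probably not worth a ping
print(exposure_risk(45, indoors=True, poor_airflow=True, both_masked=False,
                    coughing_or_talking=True))    # 36.0, definitely worth a ping
```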

So, even if the technology worked perfectly, the inability to determine risk level means we would also need rapid, non-invasive, affordable tests that we could all self-administer at home every day for 14 days (those exist!).

...but there is a wealth of data we already have that doesn't require Bluetooth or app downloads.

Advertisers use personal data collection to target their ads more precisely, which we usually agree to without thinking much about it. Could we use that same data to stop a pandemic?

Surprisingly, privacy advocates have been rooting for personal data to be used in this way. Here is what Casey Newton, Silicon Valley Editor at The Verge, said in an episode of Reset by Vox:

"We have these technologies that are being used for very simple things, like selling us Starbucks or whatever. Why don't you use it to accomplish a public health goal? We've built the system and it works, but we've only used it for kind of the dumbest purpose. Let's try to use it for a smarter one is the argument."

I'll admit, the idea of being notified by your credit card company that you need to quarantine and get tested is kind of weird, but it can actually be done in a relatively private way. Wilbert Van Panhuis, an infectious disease epidemiologist at the University of Pittsburgh, explained the process in an episode of Consequential. The idea is that if someone tests positive at a clinic or hospital, a computer program can then go through their transactions and determine who might have been exposed based on who made transactions at the same places shortly before and after the sick person did.
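
At its core, that is a simple overlap query. The sketch below is illustrative only: the data, the store IDs, and the 30-minute window are all made up, and a real system would be working with far messier records.

```python
# Toy illustration of the transaction-overlap idea: flag anyone who checked
# out at the same store within a time window of the sick person's purchase.
from datetime import datetime, timedelta

transactions = [
    # (customer_id, store_id, timestamp of checkout)
    ("patient-42", "grocery-7", datetime(2020, 8, 1, 14, 5)),
    ("cust-17",    "grocery-7", datetime(2020, 8, 1, 14, 12)),
    ("cust-88",    "grocery-7", datetime(2020, 8, 1, 16, 40)),
    ("cust-17",    "cafe-3",    datetime(2020, 8, 2, 9, 0)),
]


def possible_exposures(records, sick_id, window=timedelta(minutes=30)):
    """Return customers who checked out at the same store within `window`
    of one of the sick person's transactions."""
    sick_visits = [(store, ts) for cust, store, ts in records if cust == sick_id]
    exposed = set()
    for cust, store, ts in records:
        if cust == sick_id:
            continue
        if any(store == s and abs(ts - t) <= window for s, t in sick_visits):
            exposed.add(cust)
    return exposed


print(possible_exposures(transactions, "patient-42"))  # {'cust-17'}
```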

It's a flawed solution now that most stores limit capacity and mark giant X's showing you where to stand while you wait to check out, but the idea that we can use data we already have in an automated way is certainly worth exploring.

THE PRIVACY ARGUMENT

We don't have a clear idea of what constitutes an excessive invasion of privacy...

In a survey conducted in 2019, Pew Research Center found that roughly six out of ten Americans don't think it's possible to go through daily life without having their data collected. They also found that a majority of Americans say they don't understand what companies do with their data (59%) or what the government does with it (78%). It makes sense that very few would be thrilled to have an app on their phone that tracks who they come in contact with.

In the U.S., we don't have much in the way of laws governing data use and privacy, so deciding what's "too far" is mostly a personal thing, and it tends to fluctuate depending on what the data is, who is collecting it, and how it is used. It's circumstantial. For instance, I don't want anyone tracking my location, but if I needed someone to know where I am in an emergency, I would use Google's Personal Safety app, which allows only my emergency contacts to see where I am, and only for 24 hours.

Could we have a national framework for emergency-only uses of data? Scott Andes, the Executive Director of the Block Center for Technology and Society, thinks so:

“Could you have an everyday usage regulatory framework and way to consider the trade offs in public data use for public health, and then what it would look like in extreme situations? The reason I like this idea is it lays out the choices ahead of time, it provides a framework for us as a country to be specific about when and where, and it provides particular tools, legislatively and otherwise, to pursue those measures, and those tools have checks to them.”

...but we can let users decide how much privacy they want and give them total control over their data.

Since privacy is circumstantial, we should stop treating it as a one-size-fits-all setting that is frozen in time. Giving users the ability to choose their level of privacy on a spectrum, and to change it at any moment, would go a long way toward earning their trust.

In a contact tracing app, flexible privacy settings might look like this (a rough sketch in code follows the list):

  • Automatically or manually sharing the contact log
  • Choice to send your log to your state's public health department
  • Ability to delete your contact log at any time, or even individual entries in it
  • Allowing you to choose how far back in the contact log you want to send notifications, such as three, seven, or 14 days
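
As a rough sketch (the setting names and defaults are invented for illustration), those choices could live in a simple settings object that the rest of the app has to honor:

```python
# Hypothetical sketch only: the setting names and defaults are invented to
# illustrate privacy as a set of user-controlled choices rather than a fixed,
# one-size-fits-all policy.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List


@dataclass
class PrivacySettings:
    share_log_automatically: bool = False    # sharing is manual by default
    send_to_health_department: bool = False  # explicit opt-in
    notification_lookback_days: int = 14     # user can choose 3, 7, or 14
    auto_delete_after_days: int = 14


@dataclass
class ContactLog:
    settings: PrivacySettings
    entries: List[datetime] = field(default_factory=list)

    def purge_old_entries(self, now: datetime) -> None:
        """Honor the user's auto-delete choice every time the log is touched."""
        cutoff = now - timedelta(days=self.settings.auto_delete_after_days)
        self.entries = [ts for ts in self.entries if ts >= cutoff]

    def delete_entry(self, ts: datetime) -> None:
        """Let the user remove an individual contact at any time."""
        self.entries = [e for e in self.entries if e != ts]
```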

We could also consider making GPS location tracking an opt-in feature, because some may find it helpful for remembering who they saw. However, since the app works by corroborating that two people were in contact at a certain time, one person choosing to disclose their location at that time automatically discloses it for the other person as well. This dual-privacy issue is something to consider when adding optional features like GPS tracking.

THE ETHICS ARGUMENT

Contact tracing apps could be the gateway to mass surveillance...

When technology has made some part of everyday life easier, faster, or automatic, there's a good chance it has cost us some privacy. Convenience is a very persuasive argument that often makes us forget that we're giving up information about ourselves. Timothy Grose explained this in the context of smart speakers in an episode of Reset:

"If you asked people ten years ago, would they want to put in their households a smart speaker that could recognize your voice, that you can do shopping on, it knows your location...ten years ago people would either think that would be impossible or think, ‘oh no, that’s a little bit too weird for me’. It’s almost ubiquitous to the U.S. home now."

It's a very real possibility that a contact tracing app would lower our cultural expectation of privacy, so it's important that we take the time to fully think through its implications. Some questions we must ask ourselves are:

  • Are we making the app because we can, or because we're confident it will save lives?
  • Will users be able to understand exactly how it works and what risks they are taking?
  • Are we shifting the cultural norm in a way that will make it easier or harder for users to be taken advantage of in the future?
  • Do we expect users to just take our word for it that their data is safe?
  • Are there any indications that this app could fall victim to government overreach?

Answering these questions is very hard and likely to be controversial. These conversations will almost certainly last longer than the pandemic, but we shouldn't use that as a reason to avoid them.

...but this could be an opportunity to set an example for what responsible, transparent technology looks like.

Responsible technology is fair, useful, and trustworthy. Transparency happens when your intent, behaviors, and output are open and aligned. Hitting all of those markers can be surprisingly difficult. Here are some things that would make a contact tracing app responsible and transparent:

  • Start with honesty. During onboarding, explain exactly how it works, where data is stored and for how long, and what does and doesn't get shared.
  • Use clear language. Text should be thorough, yet concise, and the general public should be able to easily understand it.
  • Make it accessible to everyone. The app should be available to iOS and Android users, meet accessibility guidelines, and use inclusive design principles.
  • Don't automate more than necessary. Too much automation can make users feel out of control. Make the most important tasks (like sending your contact log) manual by default.
  • Test the interface for usability. Apps that are easy to use convey trust and retain users. Test the app on at least five people, and make sure the most important tasks can be completed with ease.
  • Emphasize user control. Let users choose their level of privacy, change it at any time, and delete their data whenever they want.
  • Open source the code. Allowing members of the community to verify that the app works as you say it does can help foster trust.
  • Don't hide the settings. The settings panel should be directly accessible from the home screen, and the app should remind users of their settings contextually (e.g., 'You've chosen to automatically delete your contacts after 14 days.' when looking at the contact log).
  • Make sure it works. In instances when it doesn't work, make sure users can report the issue and get help fixing the problem.

Some U.S. states have released their own versions of contact tracing apps already, and it's great to see the efforts being made to build and retain user trust.

  • Utah released an app called Healthy Together that doesn't use the Google/Apple API, and instead uses a combination of Bluetooth and GPS data to speed up the tracing process when someone gets a call from a contact tracer. The app makers, Twenty, said that it can reduce an hour-long phone call down to 16 minutes. They also offer Enterprise and Education versions.
  • Wyoming, South Dakota, and North Dakota used the Google/Apple API and worked with Microsoft to build an app called Care19 Diary.
  • Virginia released COVIDWISE which also uses the Google/Apple API.

So...should we build them?

A simulation study by Oxford University found that one infection could be stopped for every one to two app users, and that if 60% of the population used the app, we could effectively stop the epidemic. A companion survey, published on OSF, found that of roughly 6,000 respondents, 67.5% to 85.5% said they would likely install the app.

Image source: OSF

Reality paints a different picture. In early July, The Wall Street Journal reported that only 2.7% of people in France had downloaded the country's app, StopCovid. Other countries didn't do much better: just 6% in Italy, 5% in Denmark, and 14% in Germany.

At the time of writing, there are 4.83 million confirmed cases of COVID-19 in the United States alone, with daily increases of around 50,000 cases. Taking all of these numbers into account, along with my own experience designing and launching apps, I have to admit that I feel it is too late for an app to make any sizable impact on the spread of the virus. Perhaps I would feel differently if everyone had access to cheap, at-home rapid tests. I think Cindy Cohn, the Executive Director of the Electronic Frontier Foundation, said it best on an episode of Daily Tech News Show:

"How much of our societal energy do we want to put towards something that’s marginal at best as opposed to, you know, hiring humans to do old school contact tracing which does work?"

We could debate privacy and efficacy issues around these apps all day, but I think we should be asking ourselves this instead: who are we trying to help and what is their biggest problem? Do the most vulnerable to this disease even own a smartphone? Will the app only attract those who are already taking precautions, like wearing a mask and staying home? When your target market is an ambiguous "60% of the population," contact tracing apps stop making sense.

While these apps may not have much of an effect on ending the pandemic, our conversations around them have brought to light the many gaps we have in our understanding and control of the public use of personal data. Moving forward we have to start thinking of privacy as a spectrum of choices for the user, and begin the long, hard work of defining guardrails for how to safely use public health data for good. That way, when the next pandemic hits—which I hope won't happen for at least another 100 years—we might be able to use technology to stop it.

Resources for learning more about contact tracing apps:
Consequential
Pandemics, Public Data, and Privacy

This is part one of a two-part episode where Wilbert Van Panhuis, an infectious disease epidemiologist at the University of Pittsburgh; Tom Mitchell, the Lead Technologist of the Block Center and a computer scientist at Carnegie Mellon University; and Scott Andes, the Executive Director of the Block Center, use their expertise to point out the delicate balance between protecting public health using data collection and protecting personal privacy during an emergency.

Listen on Castbox
Daily Tech News Show
Civil Liberties in the Age of COVID

This episode is actually an excerpt from host Justin Robert Young's other podcast, Politics Politics Politics, where Cindy Cohn, Executive Director of the Electronic Frontier Foundation, challenges the need for contact tracing apps with logical, compelling arguments, and shares the three things she looks at when examining technology. If you're only going to listen to one of these episodes, make it this one.

Listen on acast
Reset
Does stopping coronavirus require more surveillance?

In this episode, Rebecca Heilweil, AI+algorithms reporter at Vox, and Timothy Grose, Associate Prof of China Studies at Rose-Hulman Institute of Technology, examine how the Chinese government has used extreme measures of surveillance to get the Coronavirus under control, and question what effect their methods will have on how the U.S. handles the pandemic.

Listen on Castbox
Reset
Your phone knows if you're staying at home

Casey Newton, Silicon Valley Editor at the Verge, fills us in on how the location data that is already being collected can be used to monitor and fight the spread of the virus, and why privacy advocates are actually in favor of it.

Listen on Castbox
This Week in Virology
Test often, fast turnaround, with Michael Mina

Michael Mina, Epidemiologist, Immunologist & Physician at Harvard School of Public Health & Harvard Med School, explains in very digestible detail the trajectory of virus transmission, and shares how the $1 paper-strip saliva tests he created can be used to make reopening the country much, much safer by allowing people to test themselves every day at home.

Listen on YouTube
Download This Show
How Safe is COVIDSafe?

This is a roundtable discussion about how the Australian contact tracing app works, its potential, and what the government could do to make people feel safer about using it. They ask all the right questions and give a lot of clarity to the major issues with contact tracing apps in general.

Listen on YouTube