There are several challenges involved in building truly privacy-preserving and trust-inspiring contact tracing systems, and many organizations have released principles that should guide the development of such systems (ACLU, CCC, Data Rights for Exposure Notification). The principles vary somewhat, but in my opinion the most fundamental one that any digital contact tracing system (whether assisted contact tracing or exposure notification) should follow is giving users the ability to consent.

Participation in any contact tracing scheme should be voluntary. Consent can be exercised at several points:

  1. Choosing to participate in a contact tracing system by installing an app.
  2. Choosing to snooze the app - users should have the option to turn the app off when they know they will not be leaving the house (e.g. while they’re sleeping) or when they want to keep their location private.
  3. Choosing to share proximity/location data upon diagnosis or as part of a survey.
  4. Choosing which server to upload to.
  5. Choosing to redact data before uploading it.
  6. Choosing to control the granularity of uploaded data.
  7. Choosing to participate in metric collection (the metric collection itself would ideally be privacy-preserving, e.g. via Prio, RAPPOR, or Count Mean Sketch; see the randomized-response sketch after this list).
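
To make item 7 a little more concrete, here is a minimal randomized-response sketch in the spirit of the idea underlying RAPPOR. This is an illustration only: the flip probability, function names, and report format are my assumptions, not the actual Prio, RAPPOR, or Count Mean Sketch protocols. Each device perturbs its own answer before reporting, so no individual report is trustworthy on its own, but the aggregate rate can still be estimated.

```python
import random


def randomized_response(true_value: bool, p_flip: float = 0.25) -> bool:
    """Report a boolean metric with plausible deniability.

    With probability p_flip the true answer is replaced by a coin flip,
    so no single report reveals the user's real answer with certainty.
    """
    if random.random() < p_flip:
        return random.random() < 0.5  # random answer
    return true_value  # truthful answer


def estimate_true_rate(reports: list[bool], p_flip: float = 0.25) -> float:
    """Recover an unbiased estimate of the population rate from noisy reports.

    E[reported rate] = (1 - p_flip) * true_rate + p_flip * 0.5,
    so we invert that relationship over the aggregate.
    """
    observed = sum(reports) / len(reports)
    return (observed - p_flip * 0.5) / (1 - p_flip)
```

Each user would run something like `randomized_response` locally; the collecting server only ever needs `estimate_true_rate` over the batch of noisy reports, so it learns population-level rates rather than individual answers.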

A user should be able to withdraw consent at any of these steps, at any time.
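
As a rough illustration of how these checkpoints could be wired into an app, here is a hypothetical sketch. The class, field names, and payload format are my own assumptions rather than any real contact tracing API: every consent decision is an explicit, user-editable setting, and the upload path refuses to send anything unless the relevant flags allow it.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ConsentSettings:
    """User-controlled consent flags mirroring the checkpoints above."""
    participating: bool = False          # 1. installed the app and opted in
    snoozed: bool = False                # 2. temporarily paused by the user
    share_on_diagnosis: bool = False     # 3. share proximity/location data
    upload_server: Optional[str] = None  # 4. which server (if any) to upload to
    redacted_places: list = field(default_factory=list)  # 5. locations to strip
    coarse_granularity: bool = True      # 6. reduce precision of shared data
    metrics_opt_in: bool = False         # 7. participate in metric collection


def prepare_upload(settings: ConsentSettings, location_trail: list) -> Optional[dict]:
    """Return an upload payload only if every relevant consent flag allows it."""
    if not settings.participating or settings.snoozed:
        return None  # no consent to collect or share anything right now
    if not settings.share_on_diagnosis or settings.upload_server is None:
        return None  # the user has not agreed to share, or has not chosen a server
    # 5. Redact places the user has excluded.
    trail = [p for p in location_trail if p["place"] not in settings.redacted_places]
    # 6. Optionally coarsen granularity (here: rounding coordinates, an assumption).
    if settings.coarse_granularity:
        trail = [{**p, "lat": round(p["lat"], 2), "lon": round(p["lon"], 2)} for p in trail]
    return {"server": settings.upload_server, "trail": trail}
```

Withdrawing consent then amounts to flipping a flag back to its default; nothing leaves the device unless the checks pass.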

It is, however, worth noting the limits of consent: obtaining truly informed consent is a well-known problem among privacy researchers, and some privacy philosophers even argue that informed online consent is fundamentally impossible. Users typically don’t have the time or understanding to give informed consent (and it doesn’t help that privacy policies are perplexing at best). An app that relies solely on user consent therefore cannot be said to be privacy-preserving. For example, an app could conceivably ask a user “Do you consent to share your entire GPS trail for COVID-19 research purposes?” - and buried in a privacy policy somewhere might be the implicit agreement that this GPS trail will be used for a location heat map, opening the user up to being de-anonymized. In this case, the user might have consented, but the system cannot be considered to have preserved their privacy.