ncase / covid-19

COVID-19 Futures, Explained With Playable Simulations

Privacy and trust #47

Closed jeantil closed 3 years ago

jeantil commented 4 years ago

The article has a nice explanation of the contact tracing protocol, along with this mention:

they've inspired Apple & Google to bake privacy-first contact tracing directly into Android/iOS. (Don't trust Google/Apple? Good! The beauty of this system is it doesn't need trust!)
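
For readers who want to see why the protocol itself needs so little trust, here is a minimal sketch of a decentralized, DP-3T/GAEN-style scheme. It is a deliberate simplification (the real spec derives rolling identifiers from daily keys with HKDF and AES, not plain HMAC), and all names below are illustrative:

```python
import hmac, hashlib, os

def daily_key() -> bytes:
    """A fresh random key, generated on-device once per day."""
    return os.urandom(16)

def rolling_token(key: bytes, interval: int) -> bytes:
    """Derive the short-lived token broadcast over Bluetooth.
    Tokens rotate every interval (~15 min) and are unlinkable
    to anyone who doesn't hold the daily key."""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Alice broadcasts rotating tokens; Bob's phone stores what it hears.
alice_key = daily_key()
heard_by_bob = {rolling_token(alice_key, i) for i in range(96)}  # one day

# If Alice tests positive, she publishes only her daily keys.
published_keys = [alice_key]

# Bob's phone re-derives tokens locally and checks for matches;
# no central server ever learns who met whom.
exposed = any(
    rolling_token(k, i) in heard_by_bob
    for k in published_keys
    for i in range(96)
)
print("possible exposure:", exposed)
```

On paper, the design really does keep the matching on-device. The trust problem described below is about everything around that design.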

While the proposed protocol seems genuinely anonymous and safe, it unfortunately still requires quite a bit of trust in Google/Apple and/or the app developer.

Even for open-source apps, how can you be certain that the app delivered to your phone from the app store was built from that code? You would need to verify this, on top of everything above, for the binary actually running on your device, and for every single update. If you can't do that, you are relying on trust. Not everyone can clone the source code, review it, build it, and install the resulting binary on their device (assuming you can even install a binary while bypassing the app store). A sketch of what such verification looks like follows below.
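
One concrete way to shrink this trust is reproducible builds: anyone can rebuild the app from source and check that the result matches what the store ships. A minimal sketch of the comparison step, where both file names are hypothetical:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large APKs need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical files: one APK pulled from the store, one rebuilt
# from the tagged source release by an independent auditor.
store_digest = sha256_of("tracing-app-1.2.3-store.apk")
rebuilt_digest = sha256_of("tracing-app-1.2.3-rebuilt.apk")

if store_digest != rebuilt_digest:
    raise SystemExit("store binary does not match the independent rebuild!")
print("store binary matches the rebuild:", store_digest)
```

In practice the app's signature block means the files never match byte-for-byte, so real verifiers (F-Droid's reproducible-builds process, for example) normalize or strip the signature before comparing. The point stands: only this kind of check, not faith, closes the gap between source and store, and today it is out of reach for non-developers.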

Without the ability to run external audits, you can only rely on faith that Google, Apple, or the app developer will "do no evil". You also trust them to implement the protocol without a bug that would compromise anonymity. You trust that they will collaborate so that the apps are actually interoperable and useful. And, bordering on paranoia, you trust that they will resist pressure from governments, American or otherwise, to modify the OS/app to serve other purposes. History has shown, multiple times and in every one of these areas, that such trust can be misplaced.

Network traffic analysis is no guarantee either: if the developers, especially at Google/Apple, want the tokens sent off the phone, they can make the exfiltration insanely hard to detect, and it is almost as likely that tokens will leak unintentionally, for instance through performance or analytics reports.
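
To make that concrete, here is a toy sketch of how a token could ride along inside an innocuous-looking telemetry payload. Everything here is hypothetical and deliberately simplistic, but it shows why scanning traffic for "the token" proves nothing:

```python
import json, os

token = os.urandom(16)  # a rolling proximity identifier captured on-device

# Hide the token in what looks like ordinary performance telemetry:
# each byte becomes a plausible "frame render time" in milliseconds.
report = {
    "app_version": "1.2.3",
    "frame_render_ms": [8 + b / 16 for b in token],  # 8.0-23.9 ms, looks benign
}
payload = json.dumps(report)

# A network observer sees only a performance report...
print(payload)

# ...but the receiver can reconstruct the token exactly.
recovered = bytes(int((t - 8) * 16) for t in report["frame_render_ms"])
assert recovered == token
```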

About personally identifying information: it's not only about sending your name, email, etc. It's also about sending (or capturing) technical identifiers that could be joined with external datasets to deanonymize your tokens. These technical identifiers are literally everywhere, usually unseen except by developers; at least one is well known to everyone: the IMEI. Such identifiers tend to leak because they are easy and convenient to use as keys, and developers love having unique keys.

And unique identifiers are not the whole story either: if a large enough volume of individually non-identifying information is captured or sent, it is possible to narrow the context down to a single person. That's how some analytics companies work around cookies, by using fingerprinting.
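
A toy illustration of both points, with entirely made-up data: a stable device identifier turns "anonymous" uploads into named records the moment any other dataset shares the same key, and even without one, a handful of bland attributes can single out one user:

```python
# "Anonymous" token uploads that happen to carry a device identifier
# (all values below are invented for illustration).
uploads = [
    {"device_id": "35-209900-176148-1", "token": "a3f1..."},
]

# Any external dataset keyed on the same identifier breaks anonymity.
carrier_records = {
    "35-209900-176148-1": {"name": "Jean T.", "phone": "+33 6 ..."},
}

for u in uploads:
    owner = carrier_records.get(u["device_id"])
    if owner:
        print(f"token {u['token']} belongs to {owner['name']}")

# Fingerprinting: no single field identifies anyone, but combined
# they form a key that often matches only one device in a population.
fingerprint = ("Pixel 4", "Android 11", "fr-FR", "UTC+1", "1080x2280")
print("fingerprint key:", fingerprint)
```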

With the current state of technology, tracking individuals anonymously, and guaranteeing that this anonymity will be preserved over time, is actually really, really hard.

In view of all this, writing

The beauty of this system is it doesn't need trust!

feels very optimistic. Instead of encouraging unwarranted trust, I suggest you encourage people to carry out their own risk/benefit analysis, as you do for the rest of the interventions. I would also suggest calling on would-be app publishers, Google, and Apple to provide a verification path that makes it possible for non-developers to be sure of the app they have installed.