DP-3T / documents

Decentralized Privacy-Preserving Proximity Tracing -- Documents

DP3T on mobile phones is not decentralized #284

Open OBIvision opened 4 years ago

OBIvision commented 4 years ago

The security analysis ignores the platform vendors themselves:

E.g. Google does systemic location tracking and is therefore able to link any BLE UUID to both a location and a person.

Apple's focus is more blurred, as its business model is more about profiting from monopolizing access to citizens (walled garden).

Remedy: a) shift to dedicated devices that the platforms cannot control; b) incorporate zero-knowledge-based identity such as U-Prove to ensure anonymity (not blockchain).

pelinquin commented 4 years ago

At least, remove the SIM card or use an iPod Touch. Also, each time the device makes a GET or POST HTTPS request, the OS should ask for user agreement and display the URL, exactly as the price is displayed when you pay with ApplePay.

lbarman commented 4 years ago

Hi @pelinquin, thanks for the suggestion. Obviously, due to short time constraints, we rely on hardware which people already own. Also, I don't know if you saw Google's response.

dspinellis commented 4 years ago

The epidose system, derived from the DP-3T reference implementation, can run on a $10 Raspberry Pi Zero. Based on this implementation, it's easy to build and distribute a wearable device for people who don't want to use a mobile phone or don't have a mobile phone that can run a contact tracing app.
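For reference, the "low-cost" DP-3T scheme that such a wearable would implement is simple enough for very constrained hardware: a daily secret key is rotated by hashing, and the day's EphIDs are derived from it with HMAC and AES in counter mode. A minimal Python sketch of that derivation (assuming the `cryptography` package; the epoch count and broadcast-key constant here are illustrative, and a real device must follow the published specification exactly):

```python
# Sketch of the DP-3T "low-cost" design: daily key rotation plus EphID derivation.
import hashlib
import hmac
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

EPOCHS_PER_DAY = 96               # 15-minute epochs (illustrative)
BROADCAST_KEY = b"broadcast key"  # fixed public string, per the whitepaper

def next_secret_key(sk: bytes) -> bytes:
    """Daily rotation: SK_t = SHA-256(SK_{t-1})."""
    return hashlib.sha256(sk).digest()

def ephids_for_day(sk: bytes) -> list:
    """EphID_1..n = PRG(PRF(SK_t, broadcast key)): HMAC-SHA256, then an AES-CTR keystream."""
    prf = hmac.new(sk, BROADCAST_KEY, hashlib.sha256).digest()
    keystream = (Cipher(algorithms.AES(prf), modes.CTR(b"\x00" * 16))
                 .encryptor().update(b"\x00" * (16 * EPOCHS_PER_DAY)))
    # The whitepaper additionally shuffles the order in which EphIDs are broadcast.
    return [keystream[i * 16:(i + 1) * 16] for i in range(EPOCHS_PER_DAY)]

sk = os.urandom(32)               # SK_0
todays_ephids = ephids_for_day(sk)
sk = next_secret_key(sk)          # rotate at the end of the day
```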

pelinquin commented 4 years ago

Diomidis, YES YES, GREAT WORK! This should be the first priority of all governments. I was pushing the Nordic nRF52, but the RPi Zero is also great. Let's see how to help.

dermitza commented 4 years ago

Expanding on the original post somewhat, as I am not really seeing it being discussed (openly) anywhere at the moment - if you do have references to look at, please direct me to them.

Systems that rely on any single point of control are inherently centralised, especially so if we are talking about the Google/Apple contact tracing implementation. Decentralisation can only be claimed through a good-will acceptance of the assurances that the privacy terms of the API provide.

This becomes highly problematic, especially with the background data aggregation processes of Google/Apple or similar, highly pervasive (to a possible degree of monopolisation) big data service providers, where a main business model component actually is user data aggregation and tracking.

Furthermore, the limitation on the usage of their API (one app per country) can be a double-edged sword. On the one hand, it is a noble act, aiming to discourage a flood of thousands of tracing apps and the validation and audit of each single one. On the other hand, widespread adoption of this system, reinforced by such a limitation, also ensures there exists a possibility (no matter how infinitesimal) of aggregating contact tracing data together with everything else the term big data includes.

These aggregation processes are observably segregated to a degree that allows them to go unnoticed, and this segregation, intentionally or otherwise, allows the possibility of loopholes at multiple points. Once any such loophole is discovered or made public (notice here, this could be years later), it can simply be dismissed with a "we did not know" generalised response, together with privacy policy or implementation updates. Such a response is both accepted and validated by the observable segregation itself; however, any possible privacy damage has already been done, to a sometimes irreversible degree.

A fictional example that might make the above point simpler to understand is using e.g. Google's push service to deliver notifications for DP3T.

Assuming data aggregation processes at Google's/Apple's endpoint, every time a push notification is sent to a user, it can be logged (it is unknown whether it actually is, but assumed here to be).

Suddenly you have logs of all push communications between DP3T users and the DP3T backend. At the same time, you have logging of all aggregate account data assigned to that user registered with Google/Apple. This aggregate data can include anything from real-time location, real name, date of birth, home address, to web searches, times they are most active during the day, and all habitual frequencies and preferences (the actual list is unknown but could be very long).

Any good statistical/analysis tool, with a little help from your system description (not strictly necessary), can easily derive anything from generalised location heat maps of activity (with political implications) to very individualised profiles, depending on the frequency of push data transmission, the push contents (if any), and the actual payload size between DP3T user and DP3T backend (how the actual payload is transmitted, and whether it goes via a similar service or has some identifiable characteristics, is left open here).
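To make the linking argument concrete, here is a purely hypothetical sketch (all table and field names are invented): if push-delivery logs and account-level location data sat in the same warehouse, a per-area activity picture of DP3T users would be a single join away.

```python
# Purely hypothetical illustration of the linking argument; nothing here reflects
# any real Google/Apple data store. All names and values are invented.
import pandas as pd

push_log = pd.DataFrame({   # hypothetical push-delivery log for the DP3T app
    "account_id": ["a1", "a2", "a1"],
    "app": ["dp3t", "dp3t", "dp3t"],
    "timestamp": pd.to_datetime(["2020-05-20 08:01", "2020-05-20 08:03",
                                 "2020-05-21 09:15"]),
})
locations = pd.DataFrame({  # hypothetical account-level location history
    "account_id": ["a1", "a2"],
    "home_area": ["district-7", "district-3"],
})

# One join and one group-by yield a per-district count of DP3T push events.
heat = (push_log.merge(locations, on="account_id")
                .groupby("home_area").size()
                .rename("dp3t_push_events"))
print(heat)
```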

The above-described tracking would not be so bad if this system existed in a vacuum or were provided by a service provider that does not employ user data aggregation and tracking as part of their business model. But in the case of the Google/Apple API, the only, singular assurance available is the privacy terms of the API.

Mitigating the above single point of failure would require an open audit of the entirety of the data aggregation processes that Google/Apple use, together with verifiability of the claims being audited. This, I believe, is impossible to do, both practically and within any reasonable time frame.

keugens commented 4 years ago

Expanding on the original post somewhat, as I am not really seeing it being discussed (openly) anywhere at the moment - if you do have references to look at, please direct me to them.

Concerns about big data, centralized data aggregation and tracking are discussed in many places. Openly? I don't know. Sometimes I think the discussion should look a lot different from FUD. For reference: Apple/Google have published some pretty detailed documents on what they want to do and what they don't.

On the one hand, it is a noble act, aiming to discourage a flood of thousands of tracing apps and the validation and audit of each single one.

They allow each country or region to define its own app. And there are no statements about how this should work across borders. Noble? Not quite, in my opinion.

Mitigating the above single point of failure would require an open audit of the entirety of the data aggregation processes that Google/Apple use, together with verifiability of the claims being audited. This, I believe, is impossible to do, both practically and within any reasonable time frame.

I don't get your point. We have had data aggregation for a long time. Some people trust it when they click Yes on any privacy-confirmation button, some do not. I think it is not about the time frame; it is not about a one-time "do it right now" or "open the door to hell". We need to think about rules, to build a common perception of good and bad practices, easy to grasp for the public and for companies, and about controlling functions which avoid the well-known hazards we experience with established controlling institutions.

Covid19Fighter commented 4 years ago

Hi,

Google is able to decipher the EphIDs, because the handling is done by the API and not by the government app. It is closed source; it took some time to get this information, but you can read it here:

https://github.com/google/exposure-notifications-server/issues/367
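For context, the Exposure Notification cryptography specification published by Apple/Google describes how the OS layer, not the health-authority app, derives the daily Temporary Exposure Keys and the Rolling Proximity Identifiers that are broadcast. A rough, simplified Python sketch of that published derivation (assuming the `cryptography` package):

```python
# Simplified sketch of the GAEN identifier derivation performed inside the OS layer.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def rolling_proximity_identifier(tek: bytes, interval_number: int) -> bytes:
    """RPI = AES-128(RPIK, "EN-RPI" || 0x00*6 || ENIntervalNumber), RPIK = HKDF(TEK, "EN-RPIK")."""
    rpik = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
                info=b"EN-RPIK").derive(tek)
    padded = b"EN-RPI" + b"\x00" * 6 + interval_number.to_bytes(4, "little")
    return Cipher(algorithms.AES(rpik), modes.ECB()).encryptor().update(padded)

tek = os.urandom(16)                                # Temporary Exposure Key (valid 24 h)
rpi = rolling_proximity_identifier(tek, 2_650_000)  # example 10-minute interval number
```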

I opened an issue as well on the German app:

https://github.com/corona-warn-app/cwa-documentation/issues/102

And I also questioned the Bluetooth measurements, because Bluetooth was never good enough for distance measuring at this level. I did some tests myself; you don't need a Bundeswehr show for this.

https://github.com/corona-warn-app/cwa-documentation/issues/103

Summary:

Yes, it seems Google would be able to get the infected devices, and Google has worked with some of the major players of DP3T since at least 2018. I have warned the GDPR authorities that this is a real problem and asked them to stop the German government servers from broadcasting medical data that Google can decipher.

keugens commented 4 years ago

Google is able to decipher the EphIDs, because the handling is done by the API and not by the government app. It is closed source; it took some time to get this information, but you can read it here:

The API will be part of the new Android OS. It may work to some extent even without installing the PHA app (from a public health authority authorized by Google).

And I also questioned the Bluetooth measurements, because Bluetooth was never good enough for distance measuring at this level.

It was never designed for distance measurements. Unreasonably high expectations and possible improvements are different topics.
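(For illustration of why BLE RSSI makes such a poor rangefinder: the usual log-distance path-loss conversion depends on a per-device calibration constant and an environment exponent, neither of which is reliably known in practice. The numbers below are purely illustrative.)

```python
# Toy log-distance path-loss model: d = 10 ** ((P_at_1m - RSSI) / (10 * n)).
def estimate_distance_m(rssi_dbm: float, measured_power_dbm: float = -60.0,
                        path_loss_exponent: float = 2.0) -> float:
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

rssi = -75.0
for n in (1.8, 2.0, 2.5, 3.0):   # plausible indoor path-loss exponents
    print(f"n={n}: ~{estimate_distance_m(rssi, path_loss_exponent=n):.1f} m")
# The same -75 dBm reading maps to anything from roughly 3 m to 7 m.
```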

Yes, it seems Google would be able to get the infected devices ...

And this is by design. Or a question of: who should take care of a problem without becoming part of the problem?

... stop the German government servers from broadcasting medical data that Google can decipher ...

Do you think commitments by governments are more trustworthy than those by Google?

Maybe you should publish your complete concept of how this app should work.

dermitza commented 4 years ago

I don't get your point. We have had data aggregation for a long time. Some people trust it when they click Yes on any privacy-confirmation button, some do not. I think it is not about the time frame; it is not about a one-time "do it right now" or "open the door to hell". We need to think about rules, to build a common perception of good and bad practices, easy to grasp for the public and for companies, and about controlling functions which avoid the well-known hazards we experience with established controlling institutions.

I feel this reasoning fails at two points:

Regarding the argument that it is not about "doing it now", I sincerely hope this is the case. However, the general coverage and discussions around these systems (incl. DP-3T) feel more like an arms race than what you describe.

To finally answer a question not originally directed at me:

Do you think commitments by governments are more trustworthy than those by Google?

No, I would classify them as having an equal level of trustworthiness.

Covid19Fighter commented 4 years ago

@keugens I am not sure how to solve all these problems, and if I try, they will probably say my ideas are not good and then go back to what they are doing. Nevertheless, I can tell you how this is handled in other similar situations and give some advice that seems straightforward:

keugens commented 4 years ago

@dermitza

... has a somewhat different significance ...

You are right, it could be completely different. If you had regular close contact with a person at risk, you might feel a real benefit in using such an app, which would outweigh your concerns about data aggregation. And of course, I have well-founded doubts that a small company would be safer than Google.

... then openly provide it (no matter how partial) to data (aggregation) giants.

There is a clear commitment by Apple/Google not to abuse the data. So saying "openly provide", and implicitly suggesting that there is an alternative free of any risk, is misleading, I think.

Regarding the argument that it is not about "doing it now", I sincerely hope this is the case. However, the general coverage and discussions around these systems (incl. DP-3T) feel more like an arms race than what you describe.

Sorry, maybe my argumentation was unclear: I think privacy has been an issue for a long time. And, justified or not, there are still the same or even increased concerns about privacy and monitoring. And now it is being promoted that the problem can be fixed by preventing a single app. This looks highly unreasonable to me.

dspinellis commented 4 years ago

Let me add to this discussion. Both Apple/Google and the governments can, under various application architectures, abuse their position and act in an untrustworthy way, compromising the privacy of the app's users. They can employ an architecture that doesn't by design guarantee privacy (some governments are doing this), or they can install a backdoor that compromises it. (Backdoors are easier for Apple/Google to install than for governments.) However, if Apple/Google act in an untrustworthy way they can be severely punished and ordered to comply. If a government acts in an untrustworthy way, the options are more limited, especially if it is a non-democratic regime.

keugens commented 4 years ago

@Covid19Fighter

Critical private data is NEVER broadcasted.

I do not consider RPIs and metadata as critical.

The people that got this idea are not well informed, not that wise or are working for Google. You choose.

Generally I have no preference for any company.

Critical private data is stored in well-audited data centers under government regulation and law!

This may work in some places and for some time. As a matter of fact, even democratic governments have interests in industry and private business. And it might not be bad if there is a promise to achieve more wealth for all citizens by supporting business and the economy.

... and several control instances. This happens at banks and telco operators every day. Google is not well audited and it is not really under European regulation. This is why it should never happen there. The control instances make sure that data is secure and data is deleted or not even stored.

So you blindly trust European control instances, and I have well-founded reasons not to.

I think at some point you need to decide: provide an alternative or cooperate with Google. Discrediting Google while talking up European conditions is not a viable option.

... send the UUIDs ...

Hmm. Not sure about this concept: RPIs or UUIDs, over Bluetooth or on the server ...

Let the people choose ...

Choice is good. But I would not underestimate people's expectation that everything you offer is safe and works. And I would not overestimate their skills as IT professionals and lawyers.

OBIvision commented 4 years ago

@keugens @Covid19Fighter

I think at some point you need to decide: provide an alternative or cooperate with Google. Discrediting Google while talking up European conditions is not a viable option.

I guess this is the very definition of an antitrust problem: dominating players using their power to force a structure that benefits them at the expense of the market and society.

keugens commented 4 years ago

@OBIvision It is the choice we have right now to get a working app. And power is limited for any company. There are working justice departments, which have the power to enforce things in favor of society - for society as a whole and not only for a single group.

Dominating at the expense of society? Like any others, they want to earn money, and in doing so they may go beyond limits sometimes. But there is no reason to believe that they want to fundamentally change society, away from freedom and democracy.

OBIvision commented 4 years ago

@keugens

There are two fundamental points to take from this: a) The Google/Apple model is not an anonymous or decentralized model. It is a BigTech-in-control model, and DP3T does not work in combination with the intense location/positioning and relationship tracking that is occurring.

b) The inherent motivation is just more of the same: maintaining and augmenting the antitrust models on behalf of shareholders, draining society through dysfunctional markets, power concentration and overpaying. In terms of Covid-19, it just doesn't work, because the primary objective is to prevent solutions that would reduce that antitrust power (e.g. solutions that stop feeding personal data to Google or that open Apple's walled garden), while not addressing the many other aspects of pandemic mitigation, e.g. anonymous testing and infection investigation.

This is an example of failed governance through overemphasizing one technical aspect taken out of context.

keugens commented 4 years ago

@OBIvision Thanks for letting me know your opinion.

A few suggestions to all:

  1. There is a lot of confusion about terms and definitions. To discuss something and finally find out that there was no common basis of understanding is a complete waste of time. It is up to the participants to select words carefully and not to increase the confusion level.

  2. A publication of DP3T about cooperation with Apple/Google, including detailed reasoning would be helpful.

  3. All members who vote or decide about the further actions and publications by DP3T should transparently report about their involvement in external parties and companies.

  4. An objective comparison of all known app projects could help to do a comparative risk assessment to find out the most appropriate approach and to agree about common standards for the public benefit. I know there is already one document out there, but one from DP3T would be interesting as well.

adam-burns commented 4 years ago

Two issues here:

On 24/05/2020 23:14, keugens wrote:

@OBIvision https://github.com/OBIvision It is the choice we have right now to get a working app. And power is limited for any company. There are working justice departments, which have the power to enforce things in favor of society - for society as a whole and not only for a single group.

This is an unrepresentative, biased sample of one opinion, on one justice department, presumably of one country.

Dominating at the expense of society? Like any others, they want to earn money, and in doing so they may go beyond limits sometimes.

Like any other who or what? Beyond what limits?

But there is no reason to believe that they want to fundamentally change society, away from freedom and democracy.

If "they" (companies, not people) "want" (human desire projected onto 'corporate entity') to "earn money" and may "go beyond limits sometimes", for what reason do you think "fundamental changes", "freedom" and "democracy" are not within their "interests" to change?

OBIvision commented 4 years ago

@OBIvision Thanks for letting me know your opinion.

An analysis is not an opinion. That is trying to turn a problem into an ad hominem.

I can give you an opinion: Google and Apple should not have access to the keys at all, and they should be heavily fined for antitrust violations and for preventing solutions.

2. A publication of DP3T about cooperation with Apple/Google, including detailed reasoning would be helpful.

I hardly see how this would change anything, as Google and Apple dictate the setup and prevent alternatives that are not in their interest.

keugens commented 4 years ago

An analysis is not an opinion.

Do you have your full analysis or an alternative concept description available somewhere in one place?

  1. A publication of DP3T about cooperation with Apple/Google, including detailed reasoning would be helpful.

I hardly see how this would change anything, as Google and Apple dictate the setup and prevent alternatives that are not in their interest.

If you are talking about dictating, I would highly recommend to DP3T: make a well-founded proposal and send it publicly to Apple/Google. And if this fails, everyone can judge for themselves whether it is a dictating action or an action taken for lack of alternatives.

This is an unrepresentative, biased sample of one opinion, on one justice department, presumably of one country.

How do you come to this conclusion? There are legal issues in the EU and the US. Everyone can verify this with a quick search. What do you expect? A representative study about the behaviour of all countries and companies?

Like any other who or what? Beyond what limits?

Where do you live? There are lots of companies which operate at or beyond the limit of legality.

And for completeness: there is also the limit of good public perception, which usually has a similar effect.

If "they" (companies, not people) "want" (human desire projected onto 'corporate entity') to "earn money" and may "go beyond limits sometimes", for what reason do you think "fundamental changes", "freedom" and "democracy" are not within their "interests" to change?

Because egoism is different from ideology.

A free, educated and democratic society will not allow anyone to enforce an ideological, unfree and undemocratic system.

For some political groups there are clear signs that they want to enforce a system change.

For these companies there are no such signs. And there is no path from earning money to a system change. Assuming one is at most a very vague theory, which -- according to my assessment and the current state of facts -- is not perceived as a real hazard by society. (Which does not prove there is no hazard, but could also be a sign of a deep gap of misunderstanding between experts and society.)

Apart from this, I fully agree that a monopoly is a problematic issue. And therefore my cautious vision would be: some functions will be governed by democratic institutions worldwide, and for other functions competitors will come up, possibly enforced by jurisdiction.

Covid19Fighter commented 4 years ago

@keugens: I will not comment on the rest; I do not agree with most of your arguments, but I do not think a discussion will benefit anyone. But here is some info about cooperation between members of DP3T and Google: https://actu.epfl.ch/news/epfl-strengthens-its-research-collaboration-with-3/

And also, why Google is not to be trusted with data: https://www.nytimes.com/2019/01/21/technology/google-europe-gdpr-fine.html https://mashable.com/article/bluetooth-is-bad/?europe=true https://qz.com/1169760/phone-data/

keugens commented 4 years ago

There are a lot of recent news items/comments/opinions related to the keywords "antitrust Google", e.g. "Antitrust Charges Against Google Couldn't Come at a Better Time | Opinion" (21 May 2020): https://www.newsweek.com/antitrust-charges-against-google-couldnt-come-better-time-opinion-1505770

And there are other companies affected as well: "16 Ways Facebook, Google, Apple and Amazon Are in Government Cross Hairs" (9 Sep 2019): https://www.nytimes.com/interactive/2019/technology/tech-investigations.html?login=email&auth=login-email

Regarding the app, for me the question remains:

... provide an alternative or cooperate ...

And I think there should be ONE viable concept and cooperation should take place with the largest possible coalition, with the power of society supported by justice.

OBIvision commented 4 years ago

Regarding the app, for me the question remains:

... provide an alternative or cooperate ...

Much better alternatives are possible, but they would of course never trust mobile phones - with their lack of security and built-in spyware - for such sensitive applications. One basic point is to move the keys to a dedicated device and to focus more on solving the actual problem, including e.g. anonymous testing and the massive lack of security in the communication with the back-end.

And I think there should be ONE viable concept and cooperation should take place with the largest possible coalition, with the power of society supported by justice.

This is where you see the cartel antitrust aspect play out: an attempt to take Covid-19 and the anti-centralization model hostage in order to create a lock-in to Google/Apple platform control models and business cases.

Opposing central government surveillance is fine, but replacing it with an even worse cartel-monopolized and commercialized model is not.

The real problem is that more targeted solutions are needed, but the Google/Apple cartel is trying to enforce an antitrust model with "only one" solution - complex problems require adaptive solutions.

Alternatives would of course have to be interoperable - even with an obviously less-than-perfect compromise solution - on this BLE aspect only. Bad, because no security-aware person would leave a mobile phone's Bluetooth on, as it is designed to enable tracking. And bad, because it by design allows platforms to track people and to add time/location/proximity to that tracking, which is gold for marketing abuse, e.g. intra-shop or drive-to-shop.

E.g. both Google and Apple use Bluetooth and WiFi surveillance for positioning against central databases of stationary transmitters, and thus have the ability to map a randomized UUID to a location. The problem is not that this makes the entire Covid-19 initiative a cartel antitrust operation in the positioning market, but that it makes it both built-in surveillance by design, as part of the mobile OS, and at the same time forces the Covid-19 mitigation to become a pre-programmed disaster.

keugens commented 4 years ago

Digital Contact Tracing for Pandemic Response: Ethics and Governance Guidance, edited by Jeffrey Kahn, PhD, MPH, and the Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies. Johns Hopkins University Press, 2020. https://muse.jhu.edu/book/75831

Only one snippet from this:

Technology companies alone should not control the terms, conditions, or capabilities of DCTT, nor should they presume to know what is acceptable to members of the public.

Hopefully more institutions and Google partners will become aware of their responsibility to make a positive change.

Covid19Fighter commented 4 years ago

And here you see why companies like Google, which make their money with data, should not get this particular data:

https://www.zeit.de/wissen/gesundheit/2020-05/covid-19-folgeschaeden-lunge-herz-niere-gehirn-psyche?utm_source=pocket-newtab-global-de-DE

If this data becomes available at some point to insurers, employers and other interested parties, you will get a Gattaca effect.

keugens commented 4 years ago

@Covid19Fighter I am not ready to cast Google as a bogeyman and deny them the ability to change positively. They do what their partners and society allow them to do.

keugens commented 4 years ago

... and rejecting digitization and folding your hands is also not a solution for me. Digitization is a multiplier. It multiplies existing justice as well as existing injustice.

OBIvision commented 4 years ago

@Covid19Fighter I am not ready to cast Google as a bogeyman and deny them the ability to change positively. They do what their partners and society allow them to do.

Let's not confuse evil and greed here. Google would not do anything against its own interests, but as history tells us, that prevents very little.

But protecting a market capitalization of almost 1 trillion USD, threatened by the demand to re-establish fundamental market processes lost to surveillance capitalism, can drive some pretty extreme measures: from buying politicians, through lock-in mechanisms such as the Covid-19 initiatives and hardcoding politics and interests into bad standards, to well-funded lobby campaigns - not to mention the free influencers gained from content prioritizing.

The issue is not "what society allows Google to do" but what Google can get away with - e.g. the Google/Apple cartel here, which has already jammed the Covid-19 efforts into chaos and appearance.

OBIvision commented 4 years ago

... and rejecting digitization and folding your hands is also not a solution for me. Digitization is a multiplier. It multiplies existing justice as well as existing injustice.

Straw man and whataboutism in one.

This is about the power to control and to profit - here digitization is not just scaling, but actively defining and dictating.

E.g. when Google blocks anonymizers that aim at securing search, or dictates against the same in the Covid-19 efforts.

keugens commented 4 years ago

So you could possibly agree: without partners of any kind and their willingness to participate in this greed, they wouldn't be where they are.

OBIvision commented 4 years ago

So you could possibly agree: without partners of any kind and their willingness to participate in this greed, they wouldn't be where they are.

Of course not - greed is a powerful driver, and it drives all ecosystems, even open source and charity, if you consider greed as more than material things. If greed is aligned with value creation for citizens, without stealing or manipulating, in a functioning societal framework (not an ideological insinuation), then greed works. If a musician does better economically because he entertains better, all is good - except for the other musicians, who need to do even better if they want the attention.

But that says little about legality or morality. The slave trade, botnets, drugs, the mafia, etc. are all ecosystems in themselves.

And when these combine with abuse of government power into feudal systems or corrupt regimes, the really nightmarish systems emerge.

The relevant part is the involvement of negative externalities.

keugens commented 4 years ago

So you think some evil power can rise up in any modern society, and people are all victims rather than helpers in letting this power arise and stay in place?

What negative externalities do you mean?

OBIvision commented 4 years ago

So you think some evil power can rise up in any modern society, and people are all victims rather than helpers in letting this power arise and stay in place?

Absolutely. The last 20 years proved that again - worse than ever.

All that was needed was to prevent anonymity, and digital ecosystems turned into a winner-takes-all negative spiral where any action adds to the collapse.

What negative externalities do you mean?

Surveillance, profiling, targeting, triggering, manipulation, attacks, security failures, unfair markets based on extortion, fear, distrust, filter bubbles, political deception, etc. - basically profiting at the expense of others, not in fair competition.

It is not as if we lack examples or victims; everybody feels the negative consequences. Look at e.g. local media.

That is of course focusing on the negative aspects. But the point is that the positives could be achieved without the negatives if we just designed differently, in a human-centric way.

keugens commented 4 years ago

So you think some evil power can rise up in any modern society, and people are all victims rather than helpers in letting this power arise and stay in place?

Absolutely.

OK. Then we are very far away from each other. I am looking back more than 80 years, and my parents are German.

It is the mindset of society, ranging from indifferent to opportunistic, which brings a devil to life.

It is not whataboutism; I am just wondering how you want to deal with evil powers of every stage and size.

The last 20 years proved that again - worse than ever.

Not for me. People are happy using smartphones and don't care much about side effects. Governments are happy to get more income, and the big players' partners are proud to be partners. So if they become victims one day, why shouldn't I think they have helped them to grow? Should people be considered children who are not responsible for their own actions?

keugens commented 4 years ago

Talking about victims, I am not sure what you have in mind. Operation Rubikon: for decades, the BND and CIA eavesdropped on encrypted communication from over 100 countries. Operation "Rubicon" was kept secret until today. It is considered the BND's greatest success. https://www.zdf.de/dokumentation/zdfinfo-doku/operation-rubikon--100.html

OBIvision commented 4 years ago

I guess Arizona says it: "It's not the robustness of the controls that is at issue; rather it's the complexity of multiple controls interacting with one another that concerns Arizona. The array of device-level, account-level, and app-level location data controls "misleads and deceives users of Google's products into believing that they are not sharing location information when they actually are," the complaint says." https://www.theregister.co.uk/2020/05/28/paying_arizona_google_sued_by/

keugens commented 4 years ago

"Big Tech shows the competition the torture instruments!" By Thomas Straubhaar, published 16 April 2019: http://translate.google.com/translate?hl=&sl=de&tl=en&u=https%3A%2F%2Fwww.welt.de%2Fwirtschaft%2Farticle192007047%2FGoogle-Co-Regulierung-von-Big-Tech-ist-besser-als-Zerschlagung.html

The insight of the Nobel laureate in economics, Jean Tirole, could provide a direction: one must give the monopolist incentives to behave fairly, but at the same time unequivocally show him the torture tools that will be used uncompromisingly should market power be misused at the expense of society, customers and the state.