privacytools / privacytools.io

🛡🛠 You are being watched. Protect your privacy against global mass surveillance.
https://www.privacyguides.org
Creative Commons Zero v1.0 Universal

✨ Feature Suggestion | New Page: Photos/Gallery hosting service alternative in Software Section #2044

Open sandycorzeta opened 4 years ago

sandycorzeta commented 4 years ago

Description

Photos are a major part of the data we hold and regularly share with friends and family. I myself store a lot of pictures, whether taken with a camera or captured as screenshots, on my phone and PC. I've used at least iCloud Photos and Google Photos so far, and both are difficult to ditch because of how well they organize photos into a gallery right away. I've tried Nextcloud and Piwigo as alternatives; both are really good, although they differ in terms of experience and how they work under the hood (especially in their mobile clients). Both deserve a mention in this section.

From my own experience, Nextcloud is probably the closest to the iCloud/Google Photos experience. Nextcloud already has a gallery in its app, but it is still missing the ability to show and edit EXIF tags. Piwigo is a little different from Nextcloud: it tends to be a public gallery rather than a private one, and is more of a Flickr-like alternative.

And that's all I know from my own experience. I would also like to hear from anyone who has tried other photo/gallery hosting services, whether self-hosted or not.

Thanks, Sandy

freddy-m commented 4 years ago

I think crypt.ee could fit in well here, if we end up making this a section.

johnozbay commented 4 years ago

Hello both! 👋🏻

I may be biased, but I think it would be a great category addition to PTIO, and I think Cryptee would fit quite well in this section.

Having spent 3.5 years building and working on Cryptee, I can talk a little about the threat models we see most often based on our users' feedback, the types of threat vectors out in the wild where self-hosted solutions fall short, and the technical and legal side effects of on-device encryption of photos and other media.

Most users who come to Cryptee come in search of a Google Photos, Amazon Photos, or iCloud Photos alternative, and their threat model is simply avoiding big tech.

A growing number of users need a place to keep their photos safe from nation-state actors. They wish to safeguard photos of things like police brutality, where they worry regular providers would be too risky a choice.

--

First I'll talk about a dark legal topic that I think is crucially important for the PTIO team to consider before creating this category.

Based on the feedback we get from our users, many people are searching for a solution that offers sharing as well, and without sharing features I don't think any encrypted, private provider can directly compete with Google Photos, Amazon Photos, or the like.

However, offering an on-device encrypted photo storage and sharing service nowadays presents many legal challenges to platform providers, the biggest being content like CSAM.

With on-device encrypted platform providers like Cryptee, all data is encrypted on users' devices, and only users themselves can access their data. While on-device encryption is a great solution for privacy and a great defense against lawful (or unlawful) data-requests & potential breaches, it’s not a magic bullet that solves all legal challenges.

If a platform offers photo sharing, E2EE actually complicates things immensely from a legal perspective when it comes to takedown requests. With takedowns, there are roughly 50 legal scenarios in which a platform provider can burn quite badly if it offers E2EE photo hosting combined with sharing features, CSAM being one of the worst.

One example scenario: a government catches a criminal hosting and sharing CSAM on the platform, compels them to unlock their account after they're caught, and finds out who else has access to the shared CSAM by looking into the caught user's account. Let's say they find this criminal is sharing CSAM with 5 other users on the same platform.

The next action the government has to take is to send a takedown request to the platform provider through its investigative branch (i.e. the FBI, or the local equivalent in another country), asking the platform provider to take down the 5 other accounts that also have access to the shared CSAM.

Now here comes the legal problem. The platform provider can't verify whether these accounts actually have access to CSAM, because the data is encrypted.

So if the platform provider complies and takes down the accounts or content, it would be taking the government's word for it. But what if it's someone documenting police brutality, with nothing to do with CSAM, and the government just wants it gone?

According to attorneys, authorities are not legally allowed to give anyone the encryption keys of the account they've caught with CSAM either, because they are not allowed to "share" access to CSAM. So platform providers can't verify the legitimacy of takedown requests via this route either.

And if the platform provider doesn't comply, but it is in fact hosting and enabling the sharing of CSAM, the punishment will be immediate: jail and shutdown.

And CSAM is just one example. There are many other scenarios (putting up an unreleased Marvel movie poster could trigger very expensive litigation, as could terrorist content), all of which could have catastrophic consequences for platform providers and result in astronomically expensive international court battles. To add to this, unhappy nation states could also bait platform providers into shutting down the service using this legal mechanism as an excuse.

So although we have had the code for sharing photos ready since the day we launched Cryptee 3.5 years ago, after spending many expensive hours with attorneys we've decided not to enable sharing until we have a clear legal process in place to defend ourselves against the challenges that could arise if we offered photo sharing. Thankfully we have fantastic attorneys who informed us about the potential legal challenges ahead of time, and we have never run into any issues with authorities yet. I hope these few paragraphs shed some light on how complex this topic is, and on some of the legal red flags to watch out for.

To summarize, I would strongly encourage the PTIO team to deeply and thoroughly investigate the legitimacy of each company before listing it on the PTIO site, especially and specifically if the company or platform provider offers E2EE photo hosting with sharing features. I can gladly hop on a call with the PTIO team in private to provide more details and insight into why this topic is so complicated, give some further tips on what to watch out for, and explain why you should be incredibly careful before listing any company with sharing features in this category.

--

Onto some less dark topics –

--

Since @sandycorzeta mentioned showing and editing EXIF, I can offer some insight into why this is very difficult to achieve with on-device encryption.

When you upload a photo to unencrypted services like Google / Amazon / iCloud, since your photos are stored unencrypted, these services do all sorts of server-side magic to make your photos easy to serve back to you in a gallery view (e.g. when you upload a 10 MB photo, they scale, crop, generate smaller thumbnail images, and convert the image to JPG if it's a RAW photo, all on the server side). They also extract and format EXIF data on the server, making it easily searchable and editable.

With on-device encrypted photo providers like Cryptee, on the other hand, your photos are encrypted on your device and our servers cannot see them, so we can't do any of this server-side magic. With Cryptee, all photo thumbnails and preview sizes are generated using the canvas in your browser before being encrypted and uploaded.

So when you upload a new photo, Cryptee first reads your original-sized photo (O), then uses the browser canvas to create: 1) a cropped, small square thumbnail for the gallery (SM), and 2) a medium-sized preview image for the lightbox (M).

It then encrypts the SM, M, and original (O) images, and uploads all three.
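To illustrate that client-side pipeline, here is a minimal sketch of the dimension math behind the SM and M derivatives. The target sizes (256 px thumbnail, 1600 px preview) are made-up assumptions for the example; Cryptee's real values and code will differ, and the actual pixel work happens in canvas `drawImage` calls that this sketch only parameterizes.

```javascript
// Given the original (O) dimensions, compute:
//   SM: a centered square crop, scaled to a small thumbnail
//   M:  a fit-within preview, never upscaled
// Target sizes here are illustrative assumptions, not Cryptee's values.
function deriveSizes(width, height, thumb = 256, previewMax = 1600) {
  // SM: take the largest centered square, then scale it to thumb × thumb.
  const square = Math.min(width, height);
  const crop = {
    x: Math.floor((width - square) / 2),
    y: Math.floor((height - square) / 2),
    size: square, // source square edge in the original
    out: thumb,   // output edge after scaling
  };

  // M: scale down so the longest edge fits previewMax; never upscale.
  const scale = Math.min(1, previewMax / Math.max(width, height));
  const preview = {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };

  return { crop, preview };
}
```

For a 4000×3000 original, this yields a 3000 px centered square (offset 500 px from the left) scaled to 256 px, and a 1600×1200 preview; each derivative is then encrypted before upload, so the server only ever stores ciphertext.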

Now, the EXIF data remains in the original photo (which may be 50 MB if it's a RAW file), and it would be incredibly impractical and slow to show the original-sized 50 MB photos in the lightbox while you're casually swiping through them. It would take forever to download, decrypt, and decode it all. Due to this impracticality, no on-device encrypted photo storage provider can easily offer EXIF editing, because it would require downloading each photo, decrypting it, decoding it, editing the EXIF on the device, then re-encoding, re-encrypting, and uploading it again. (And I'm not even talking about bulk-editing the EXIF of 100 photos here; that would be downright impossible without downloading the entire gallery.)
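To put rough numbers on that round trip (the photo sizes here are illustrative assumptions, not Cryptee measurements), the transfer cost of a bulk EXIF edit under E2EE scales with the full photo payload, because the server can never touch the ciphertext:

```javascript
// Bytes moved to edit only the EXIF of N encrypted photos: each one must
// be downloaded, decrypted, edited, re-encrypted, and uploaded in full.
function exifEditTraffic(photoCount, avgPhotoBytes) {
  const download = photoCount * avgPhotoBytes; // fetch full originals
  const upload = photoCount * avgPhotoBytes;   // push re-encrypted originals
  return download + upload;
}

// Bulk-editing 100 RAW photos of ~50 MB each moves about 10 GB of data,
// versus a few kilobytes of metadata writes on an unencrypted service
// that can edit EXIF server-side.
const totalGB = exifEditTraffic(100, 50e6) / 1e9;
```

This is the asymmetry in a nutshell: an unencrypted provider pays a constant, tiny cost per edit, while an E2EE provider pays twice the full file size per photo.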

We're working on experimental solutions like separating the EXIF from the original on your device before encryption and upload, encrypting the EXIF and the photo separately, then gluing them back together if you download the original at some point. But it turns out that although EXIF is well standardized, every camera manufacturer uses a ton of custom tags, and things get real complicated real quick when you try to extract and glue things back together for editing.

[I'm not going to get into other complexities, like what happens when your browser canvas is disabled to prevent fingerprinting and you end up with all-white thumbnails depending on the browser. That's a whole different monster of a topic with lots of interesting angles to consider and discuss.]

--

As for self-hosted photo management solutions: while they are great for many things like data ownership, I'd respectfully argue that, depending on their threat model, they're a bad choice for photo hosting for the average not-so-tech-savvy user. In fact, self-hosting could give a false sense of security and make things considerably worse after a year or two.

First, self-hosting assumes the user keeps both the server's OS and the hosted service up to date. Second, to have good security the user needs to know how to set up a firewall, patch the latest security vulnerabilities in the OS and the service, and keep the VPS host's credentials safe. Both of these could be fine for the first year or two for most people's threat models, but without proper system administration, after one or two years it becomes increasingly likely that a self-hosted server run by a non-tech-savvy user will become a target.

Or to approach things differently, let's say the user is self-hosting at home. This setup requires a router that can safely open a port for access without causing further security nightmares (and access to the router in the first place, so no shared router connectivity like on campuses). And if the server is at home, the user needs to be careful not to spill coffee on it 😅☕️

On top of all this, if they wish to access their photos over the internet, users won't want to type in a bare IP address, so add the complexity of setting up DNS for a domain name, which is out of reach for most non-tech-savvy users, and they'd need to make sure their domain registrar, DNS servers, and credentials are all locked down and safe.

Finally, even if they do all this amazingly well, users only get the added privacy benefits of self-hosting if they also read the source code of what they're self-hosting. Otherwise, they're running software they don't know on their own machine and network (and the threat vectors are far worse on a home network). With all this in mind, if a user is willing to read through source code to verify that their data is indeed private, they can save many steps and headaches by instead reading the source code of managed providers like Cryptee.

Needless to say, these are thoughts and opinions jotted down quickly based on my experience running Cryptee and interacting with our user base, so other platform founders and users may have differing experiences or opinions, and I'll humbly respect those! 🙏🏻

My hope is that these lines provide some insight into the legal and technical difficulties privacy-first companies like Cryptee face, help team PTIO gain a deeper understanding of the challenges awaiting the privacy companies that would be listed in this category, and help team PTIO make a more technically and legally informed decision.

All the best, and thanks for the service you guys are doing for the community! ✌🏻