db0 / fedi-safety

A script that goes through Lemmy images in storage and tries to prevent illegal or unethical content
GNU Affero General Public License v3.0

Request: add the ability to quarantine images, instead of deleting them #5

Open csolisr opened 10 months ago

csolisr commented 10 months ago

Some jurisdictions may require storing CSAM, instead of deleting it, for law enforcement purposes. In this case the material must be securely stored and its access limited. See https://github.com/iftas-org/resources/tree/main/CSAM-CSE#reporting for details.

This tool currently deletes detected potential CSAM, which, as described above, may constitute a violation of the law in certain jurisdictions. In order to comply with such regulations, this tool should have the ability to quarantine detected images in secure, access-restricted storage instead of deleting them.
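
To illustrate the request, here is a minimal Python sketch of what such a quarantine step could look like (fedi-safety is written in Python). Everything here is hypothetical: the `QUARANTINE_DIR` path and the `quarantine_image` function are not part of the tool; this only shows the "securely stored, access-limited" idea from the IFTAS guidance using standard-library calls.

```python
import os
import shutil
import uuid
from pathlib import Path

# Hypothetical quarantine location; not part of fedi-safety's codebase.
QUARANTINE_DIR = Path("/var/lib/fedi-safety/quarantine")

def quarantine_image(image_path: Path) -> Path:
    """Move a flagged image into an access-restricted directory
    instead of deleting it outright."""
    # Create the quarantine directory, accessible only to the service user.
    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    os.chmod(QUARANTINE_DIR, 0o700)

    # Use a random name so the stored filename leaks no information.
    dest = QUARANTINE_DIR / f"{uuid.uuid4().hex}{image_path.suffix}"
    shutil.move(str(image_path), str(dest))

    # Restrict the file itself to owner read/write only.
    os.chmod(dest, 0o600)
    return dest
```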

poVoq commented 10 months ago

IANAL, but this only applies if you are aware of the specific CSAM on your server. So if this system automatically scans and deletes it before you become aware of it, there is no requirement to quarantine it as evidence. In the end, it is no different from bulk-deleting any other file.

Which is kind of the point, as interacting with law enforcement on this can get you into hot water, even if you approach it with the best intentions. In Germany, for example, doing what you describe can land you in prison even if you only stored the material as evidence. There are efforts to revise that law, since it extends to entirely innocent people (some details in German here), but this hasn't happened yet.

I think in many jurisdictions the safest approach is to delete the material before you are even aware of it (which this tool can do): it is the quickest option, and what you don't have can't be used as evidence against you.

db0 commented 10 months ago

I can add this functionality as an optional feature, but keep in mind that because this tool mostly catches false positives, you would end up with thousands of images to filter through manually.
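
A minimal sketch of how the feature could stay opt-in, keeping deletion as the default. The `--quarantine-path` flag and `handle_flagged` function are illustrative assumptions, not existing fedi-safety options:

```python
import argparse
import shutil
from pathlib import Path

# Hypothetical CLI wiring: the --quarantine-path flag is illustrative
# and does not exist in fedi-safety today.
parser = argparse.ArgumentParser(description="fedi-safety (sketch)")
parser.add_argument(
    "--quarantine-path",
    type=Path,
    default=None,
    help="If set, move flagged images here instead of deleting them.",
)
args = parser.parse_args()

def handle_flagged(image_path: Path) -> None:
    if args.quarantine_path is None:
        # Default behaviour: delete immediately, so the operator never
        # becomes aware of the specific content.
        image_path.unlink(missing_ok=True)
    else:
        # Opt-in behaviour: retain for review, accepting the manual
        # filtering burden described above.
        args.quarantine_path.mkdir(parents=True, exist_ok=True)
        shutil.move(str(image_path), str(args.quarantine_path / image_path.name))
```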