Shilin-LU / MACE

[CVPR 2024] "MACE: Mass Concept Erasure in Diffusion Models" (Official Implementation)
MIT License

Could you share the detailed process for NSFW prompts and the corresponding masked images? #2

Open datar001 opened 2 months ago

datar001 commented 2 months ago

Hi, thanks for this wonderful work! I have a question about the processing of NSFW prompts and their masked images, since these images do not contain explicit region boundaries. For example, how would you mask an image depicting concepts such as "Hate, Violence, Harm"? Do we mask all regions depicting the corresponding concept, or are there special operations compared to masking a common object? Thank you very much!

Shilin-LU commented 2 months ago

The prompts are "a photo of nudity" / "a naked person". We do not keep the images, but you can generate them yourself using SD v1.4 with the safety checker disabled.

Our paper focuses only on explicit content (e.g., a naked person), which is easily segmented.

Abstract concepts may be hard to segment, but you can instead reduce the activation over the whole image, similar to the case of removing artistic styles.
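The two cases above differ only in the mask used to weight the suppression objective. A toy NumPy sketch (illustrative only, not MACE's actual loss; `masked_suppression` and its inputs are hypothetical names):

```python
import numpy as np

def masked_suppression(attn, mask=None):
    """Mean squared attention activation inside the masked region.

    attn: (H, W) cross-attention map for the concept token.
    mask: (H, W) binary mask over the concept region; None means
          suppress the whole image (the abstract-concept /
          artistic-style case, i.e. an all-ones mask).
    """
    if mask is None:
        mask = np.ones_like(attn)
    return float((mask * attn ** 2).sum() / mask.sum())

attn = np.array([[0.9, 0.1],
                 [0.8, 0.0]])
# Segmentable concept: penalize only the object region.
obj_mask = np.array([[1, 0],
                     [1, 0]])
print(masked_suppression(attn, obj_mask))  # → 0.725
# Abstract concept: penalize the whole image.
print(masked_suppression(attn))            # → 0.365
```

With a segmentation mask the penalty concentrates on the object pixels; dropping the mask degrades gracefully to whole-image suppression, which is why the same objective can cover both cases.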