damus-io / damus

iOS nostr client
GNU General Public License v3.0

Apple NSFW detector: detect and blur sensitive content option #1553

Open alltheseas opened 1 year ago

alltheseas commented 1 year ago

user story

As a Damus user who browses nostr, I would like the option for Damus to detect NSFW images, so that I do not see NSFW content on my device.

acceptance criteria

  1. User has the option to turn the NSFW content detection feature on or off
  2. If on, NSFW content is automagically detected
  3. If on, NSFW content is automagically blurred (?) (see design discussion)

design

@robagreda does it make sense to "flag" nsfw content, and e.g. blur image, or to not show nsfw identified content altogether?

Might this be two settings: 1) identify and blur nsfw content, and 2) do not show nsfw content?

We could have a setting for that: we add the label with a segmented control to either show, hide, or blur. Up to the user.
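The show/hide/blur segmented control could be modeled as a simple setting. A minimal SwiftUI sketch, assuming illustrative names (`ContentFilterMode` and the `nsfw_filter_mode` storage key are not Damus's actual settings identifiers):

```swift
import SwiftUI

// Hypothetical setting: how to present content flagged as sensitive.
enum ContentFilterMode: String, CaseIterable, Identifiable {
    case show, blur, hide
    var id: String { rawValue }
}

struct SensitiveContentSettingView: View {
    // Illustrative storage key; Damus's real settings store may differ.
    @AppStorage("nsfw_filter_mode") private var mode: ContentFilterMode = .blur

    var body: some View {
        Picker("Sensitive content", selection: $mode) {
            ForEach(ContentFilterMode.allCases) { m in
                Text(m.rawValue.capitalized).tag(m)
            }
        }
        .pickerStyle(.segmented)
    }
}
```

`@AppStorage` persists the raw `String` value, so the choice survives app restarts without extra plumbing.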

Apple implementation reference

https://developer.apple.com/documentation/sensitivecontentanalysis
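For reference, the core of that framework is `SCSensitivityAnalyzer`, which returns an `isSensitive` verdict per image. It is iOS 17+ and requires the `com.apple.developer.sensitivecontentanalysis.client` entitlement. A minimal sketch of how a check might look:

```swift
import SensitiveContentAnalysis
import CoreGraphics

/// Returns whether the system classifies the image as sensitive,
/// or nil when analysis is unavailable (e.g. the user has the
/// system-level Sensitive Content Warning setting turned off).
func isSensitive(_ image: CGImage) async -> Bool? {
    let analyzer = SCSensitivityAnalyzer()
    // When the policy is .disabled, the system will not analyze anything.
    guard analyzer.analysisPolicy != .disabled else { return nil }
    do {
        let analysis = try await analyzer.analyzeImage(image)
        return analysis.isSensitive
    } catch {
        return nil
    }
}
```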

origin: https://github.com/damus-io/damus/issues/910#issuecomment-1731633193

Related

NIP-36 https://github.com/damus-io/damus/issues/910

jerihass commented 1 year ago

I'm going to start working on this one this week.

robagreda commented 1 year ago

Great, will work on some design today :) will post the design link here when finish!

alltheseas commented 1 year ago

Great, will work on some design today :) will post the design link here when finish!

My personal preferences:

- for NSFW PFPs: blur, or do not display the PFP at all (show a blank/solid color, an avatar/emoji (e.g. see robosats), or a replacement image)
- for posts with NSFW content: do not show them to me at all

robagreda commented 1 year ago

If you want to take the whole settings redesign @jerihass do it :)

https://www.figma.com/file/ORaT1T0Ywfbm0sIjwy5Rgq/Damus-iOS?type=design&node-id=3436%3A32823&mode=design&t=lhs654Fya7JdSWv9-1

[Screenshot: CleanShot 2023-10-03 at 16 50 31@2x]

Something like this could work? @alltheseas @jerihass @jb55

alltheseas commented 1 year ago

Something like this could work? @alltheseas @jerihass @jb55

I would add a "show" option to PFP, as you have with the show nsfw tagged posts.

I would also add a new row titled "iOS Sensitive Content Analysis" with hide, blur, and show options.

Question for you @robagreda - if the user chooses blur setting, would they expect to reveal the blurred image with a tap?

jerihass commented 1 year ago

One thing to note:

The Apple framework (SensitiveContentAnalysis) is iOS 17 only.

alltheseas commented 1 year ago

One thing to note:

The Apple framework (SensitiveContentAnalysis) is iOS 17 only.

That's fine. My understanding is that Damus has already accepted that the nav stack is iOS 17 only.

robagreda commented 1 year ago

[Screenshot: CleanShot 2023-10-03 at 17 36 58@2x]

Something like this @alltheseas ?

jerihass commented 1 year ago

@robagreda For the emoji selection - is that for the NSFW PFP or all NSFW images? We can basically design the blurred NSFW images or cover them however we want (see images at https://developer.apple.com/documentation/sensitivecontentanalysis/detecting_nudity_in_media_and_providing_intervention_options).

So, the blurred-image presentation can be customized, and if we want any sort of buttons/options we can put those in too. In fact we have to design it all, as the framework just analyzes the images and tells us whether each is NSFW; we then handle it accordingly.
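Since the framework only returns a verdict, the intervention UI is entirely up to the app. A rough SwiftUI sketch of a blur-with-reveal treatment (view and property names are illustrative, not Damus's actual views):

```swift
import SwiftUI

// Illustrative wrapper: blurs an image flagged as sensitive
// until the user explicitly taps to reveal it.
struct BlurredSensitiveImage: View {
    let image: Image
    let isSensitive: Bool
    @State private var revealed = false

    var body: some View {
        image
            .resizable()
            .scaledToFit()
            .blur(radius: isSensitive && !revealed ? 30 : 0)
            .overlay {
                if isSensitive && !revealed {
                    VStack(spacing: 8) {
                        Text("Sensitive content")
                        Button("Show") { revealed = true }
                    }
                }
            }
    }
}
```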

robagreda commented 1 year ago

Yes, the emoji is for the PFP only. The control above applies to all images or content tagged NSFW. We will apply the blur when the user selects it via the segmented control in the settings page.

And great, will design a version of images blurred with options to un-blur maybe only when you tap the button.

robagreda commented 1 year ago

[Screenshot: CleanShot 2023-10-03 at 19 41 31@2x]

https://www.figma.com/file/ORaT1T0Ywfbm0sIjwy5Rgq/Damus-iOS?type=design&node-id=2070%3A44364&mode=design&t=9njtjYvomMDcS4p8-1

jerihass commented 1 year ago

[Screenshot: CleanShot 2023-10-03 at 17 36 58@2x] Something like this @alltheseas ?

The last option here -- can you clarify the purpose?

Is it to choose from:

- blurred content
- emoji covering
- override the setting that is in System Settings?

If the last option is present, I think it could be confusing to users.

Maybe there could be a link to the sensitive content system setting option page?

alltheseas commented 1 year ago

The last option here -- can you clarify the purpose?

If the last option is present, I think it could be confusing to users.

Maybe there could be a link to the sensitive content system setting option page?

Why do you think the option would be confusing?

Damus previously received feedback about an accessibility setting (I believe it was disable animations) where we initially referred users to iOS settings. The specific request from that set of users was to have an individual setting in the Damus settings menu that overrides the iOS setting.

jerihass commented 1 year ago

Why do you think the option would be confusing?

When I was thinking through how the options would be used, I could see a case where a user enabled the filter in the Damus settings but it wasn't enabled in the iOS settings, or where they enabled filtering in the iOS settings and disabled the filter in the Damus settings, and they would get the reverse of the expected behavior.

On further thought, I'm inclined to have some way to show the user in this view whether the iOS system Sensitive Content setting is enabled or disabled; that way the behavior would be more up front.
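The framework does expose the system-level state: `SCSensitivityAnalyzer.analysisPolicy` is `.disabled` when neither Sensitive Content Warning nor Communication Safety is on, which the settings view could surface. A sketch (the status strings are illustrative):

```swift
import SensitiveContentAnalysis

/// Describe the state of the system-level sensitive-content setting,
/// so the app's settings view can show why filtering is unavailable.
func systemFilterStatus() -> String {
    switch SCSensitivityAnalyzer().analysisPolicy {
    case .disabled:
        return "Off — enable Sensitive Content Warning in iOS Settings"
    case .simpleInterventions:
        return "On (simple interventions)"
    case .descriptiveInterventions:
        return "On (descriptive interventions)"
    @unknown default:
        return "Unknown"
    }
}
```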

alltheseas commented 1 year ago

reverse expected behavior.

This is a fair concern.

On further thought, I'm inclined to have some way to show the user in this view whether the iOS system Sensitive Content setting is enabled or disabled; that way the behavior would be more up front.

I like this approach.

Question for you, @robagreda, and @jb55 - do we want an option to not show content labeled as sensitive by iOS at all (similar to "do not show #nsfw tagged posts")?

jb55 commented 1 year ago

On Thu, Oct 05, 2023 at 07:04:08PM -0700, alltheseas wrote:

reverse expected behavior.

This is a fair concern.

On further thought, I'm inclined to have some way to show the user in this view whether the iOS system Sensitive Content setting is enabled or disabled; that way the behavior would be more up front.

I like this approach.

Question for you, @robagreda, and @jb55 - do we want an option to not show content labeled as sensitive by iOS at all (similar to "do not show #nsfw tagged posts")?

No, we should show it, but blur it and label it as "sensitive content", with a button to unblur, like Twitter does.

jerihass commented 1 year ago

To clarify - I was suggesting a way to notify the user that image content can be screened with the system filter, and I think we should just show or hide the options based on whether the system settings support it.

One more thing - sometimes the content filter takes a while to determine whether isSensitive is true or false, and we can show a loading or temporary view there.

Most things are screened properly, but some images still slip by the filter, unfortunately.

Need to work on profile pictures. [Screenshots: IMG_0044, IMG_0045]

jb55 commented 1 year ago

On Sun, Oct 08, 2023 at 02:12:29PM -0700, jerihass wrote:

One more thing - sometimes the content filter takes a while to determine if isSensitive is true/false, and we can show a loading or temporary view there.

it should probably be a part of the preloading logic then. We preload stuff before it scrolls into view.

alltheseas commented 1 year ago

On Sun, Oct 08, 2023 at 02:12:29PM -0700, jerihass wrote:

One more thing - sometimes the content filter takes a while to determine if isSensitive is true/false, and we can show a loading or temporary view there.

it should probably be a part of the preloading logic then. We preload stuff before it scrolls into view.

@jerihass do you need help with the preloading logic?

jerihass commented 1 year ago

@jerihass do you need help with the preloading logic?

Not quite yet - I'll dig into the Kingfisher stuff and see if I can start analyzing the images as soon as they finish downloading and then store the isSensitive result.
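One way that could look: run the analysis right after Kingfisher hands back the downloaded image and memoize the verdict keyed by URL, so scrolling never re-analyzes. A sketch under those assumptions (`SensitivityCache` is illustrative; `KingfisherManager.retrieveImage` is Kingfisher's real entry point, but this is not Damus's actual preload code):

```swift
import Kingfisher
import SensitiveContentAnalysis
import UIKit

/// Illustrative cache of sensitivity verdicts, keyed by image URL.
/// An actor keeps the dictionary safe across concurrent preloads.
actor SensitivityCache {
    private var verdicts: [URL: Bool] = [:]

    /// Fetch (or reuse) the analysis verdict for an image URL.
    func isSensitive(_ url: URL) async -> Bool? {
        if let cached = verdicts[url] { return cached }
        guard let cgImage = await downloadCGImage(url) else { return nil }
        let analyzer = SCSensitivityAnalyzer()
        guard analyzer.analysisPolicy != .disabled else { return nil }
        guard let analysis = try? await analyzer.analyzeImage(cgImage) else { return nil }
        verdicts[url] = analysis.isSensitive
        return analysis.isSensitive
    }

    /// Bridge Kingfisher's completion-handler download into async/await.
    private func downloadCGImage(_ url: URL) async -> CGImage? {
        await withCheckedContinuation { cont in
            KingfisherManager.shared.retrieveImage(with: url) { result in
                cont.resume(returning: try? result.get().image.cgImage)
            }
        }
    }
}
```

Calling this from the existing preload path would mean the verdict is usually ready before the image scrolls into view, addressing the loading-view concern above.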

jerihass commented 1 year ago

@alltheseas The static and animated images are at a 95% solution; needs some refactoring. One thing I am unable to figure out is what is causing some images to not be analyzed. Initially I thought the analyzer was taking a long time, but now I think some images are not going through the pre-loader and are missing the analysis step, though I'm not sure where in the codebase this is happening. Is there a chance some images get loaded with Kingfisher without going through the preload image function? https://gist.github.com/jerihass/99663a5489fc60c169bb2dac3700fd88

alltheseas commented 1 year ago

@jb55 do you recall who helped with Kingfisher?

alltheseas commented 1 year ago

https://damus.io/note1q5hrvt5wt6sqyg6wrhq07vdfzfacmkrzm5hk6mpu5pdxxswx07jqfmtg3h

Caused by

https://damus.io/note1xjxdkqy50k0dp677ka680wrgzxl64jy79fwcap9k6d9qucc6kcrsmr83nc

alltheseas commented 10 months ago

I wonder how iOS sensitive content detection performance compares to dignifai https://github.com/damus-io/damus/issues/1981