alltheseas opened 1 year ago
I'm going to start working on this one this week.
Great, will work on some design today :) will post the design link here when finished!
My personal preferences:
- For NSFW PFPs: blur the PFP, or do not display it at all - show a blank/solid color, an avatar/emoji (e.g. see RoboSats), or a replacement image
- For posts tagged NSFW: do not show them at all
If you want to take the whole settings redesign @jerihass do it :)
Something like this could work? @alltheseas @jerihass @jb55
I would add a "show" option to PFP, as you have with the show nsfw tagged posts.
I would also add a new row titled "iOS Sensitive Content Analysis" with hide, blur, and show options.
Question for you @robagreda - if the user chooses blur setting, would they expect to reveal the blurred image with a tap?
One thing to note:
The Apple framework (SensitiveContentAnalysis) is iOS 17 only.
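For reference, a minimal sketch of gating the framework behind an availability check. The helper name and fallback behavior are illustrative, not Damus code; it returns `nil` where analysis is unavailable so callers could fall back to NIP-36 #nsfw-tag filtering on older OS versions:

```swift
import CoreGraphics
import SensitiveContentAnalysis  // iOS 17+ / macOS 14+ only

// Hypothetical helper: returns nil when the framework or the system
// setting is unavailable, so callers can fall back to tag-based
// filtering; returns the analyzer's verdict otherwise.
func analyzeIfAvailable(_ image: CGImage) async -> Bool? {
    guard #available(iOS 17.0, macOS 14.0, *) else { return nil }
    let analyzer = SCSensitivityAnalyzer()
    // analysisPolicy is .disabled unless the user has turned on the
    // system "Sensitive Content Warning" setting.
    guard analyzer.analysisPolicy != .disabled else { return nil }
    do {
        return try await analyzer.analyzeImage(image).isSensitive
    } catch {
        return nil  // treat analysis failure as "unknown"
    }
}
```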
That's fine. My understanding is that Damus has already accepted that the nav stack is iOS 17 only.
Something like this @alltheseas ?
@robagreda For the emoji selection - is that for the NSFW PFP or all NSFW images? We can basically design the blurred NSFW images or cover them however we want (see images at https://developer.apple.com/documentation/sensitivecontentanalysis/detecting_nudity_in_media_and_providing_intervention_options).
So the blurred-image presentation can be customized, and if we want any buttons/options we can add those too. In fact we have to design it all: the framework just analyzes the images and tells us whether each one is sensitive, and we handle the rest accordingly.
Yes, the emoji is for the PFP only. The control above applies to all images or content tagged NSFW. We will use the blur treatment when it is selected via the segmented control on the settings page.
And great - I'll design a version where images are blurred, with an option to un-blur, perhaps only when you tap the button.
Something like this @alltheseas ?
The last option here -- can you clarify the purpose?
Is it to choose from:
- blurred content
- emoji covering
- overriding the setting in System Settings?

If that last option is present, I think it could be confusing to users.
Maybe there could be a link to the sensitive content system setting option page?
Why do you think the option would be confusing?
Damus previously received feedback about an accessibility setting (I believe it was disabling animations) where we initially referred users to iOS Settings. The specific request from that set of users was to have an individual setting in the Damus settings menu that overrides the iOS setting.
When I was thinking through how the options would be used, I could see a case where a user enables the filter in the Damus settings but not in the iOS settings, or enables filtering in iOS settings while disabling the filter in Damus settings, and gets the reverse of the expected behavior.
Thinking about it further, I'm inclined to show in this view whether the iOS system Sensitive Content setting is enabled or disabled, so the behavior is more up front.
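One way to surface that, sketched under the assumption that the framework's `analysisPolicy` property reflects the system setting (the function name and strings are illustrative, not Damus code):

```swift
import SensitiveContentAnalysis

// Hypothetical status line for the Damus settings view, so users can
// see the iOS system setting's state before toggling the Damus one.
@available(iOS 17.0, macOS 14.0, *)
func systemFilterStatus() -> String {
    // .disabled means the user has not enabled the system "Sensitive
    // Content Warning" setting; any other policy means it is on.
    if SCSensitivityAnalyzer().analysisPolicy == .disabled {
        return "iOS Sensitive Content Warning: off"
    } else {
        return "iOS Sensitive Content Warning: on"
    }
}
```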
> reverse expected behavior.

This is a fair concern.

> Further thinking I’m inclined to have some way to show the user that the iOS system Sensitive Content settings are enabled or disabled in this view, and in this fashion the behavior would be more up front.

I like this approach.

Question for you, @robagreda, and @jb55 - do we want an option to not show content labeled as sensitive by iOS at all (similar to "do not show #nsfw tagged posts")?
On Thu, Oct 05, 2023 at 07:04:08PM -0700, alltheseas wrote:

> Question for you, @robagreda, and @jb55 - do we want to have an altogether not show content labeled as sensitive by iOS (similar to do not show #nsfw tagged posts)?
No, we should show it, but blurred and labeled as "sensitive content", with a button to un-blur, like Twitter does it.
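A SwiftUI sketch of that treatment; the view and helper names are hypothetical, not from the Damus codebase, and the caller is assumed to already have an `isSensitive` verdict:

```swift
import SwiftUI

// Pure helper so the reveal logic is testable without UI:
// cover the image only while it is sensitive and not yet revealed.
func shouldCover(isSensitive: Bool, revealed: Bool) -> Bool {
    isSensitive && !revealed
}

// Twitter-style cover: blur plus a "Sensitive content" label and an
// un-blur button that reveals the image on tap.
struct SensitiveImageView: View {
    let image: Image
    let isSensitive: Bool
    @State private var revealed = false

    var body: some View {
        ZStack {
            image
                .resizable()
                .scaledToFit()
                .blur(radius: shouldCover(isSensitive: isSensitive, revealed: revealed) ? 30 : 0)
            if shouldCover(isSensitive: isSensitive, revealed: revealed) {
                VStack(spacing: 8) {
                    Text("Sensitive content")
                    Button("Show") { revealed = true }
                }
                .padding()
            }
        }
    }
}
```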
To clarify - I was suggesting a way to notify the user that image content can be screened with the system filter. I think we should just show or hide the options based on whether the system settings support it.
One more thing - sometimes the content filter takes a while to determine whether `isSensitive` is true or false, and we can show a loading or temporary view there.
Most images are screened properly, but some still slip past the filter, unfortunately.
Need to work on profile pictures.
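Since the verdict can still be pending, the per-image state is naturally tri-state; a small sketch (hypothetical type, not from the Damus codebase):

```swift
// Tri-state for an image whose sensitivity analysis may still be running.
enum SensitivityState: Equatable {
    case pending    // analyzer still running: show a loading/placeholder view
    case clear      // analyzed, not sensitive: show the image
    case sensitive  // analyzed, sensitive: show the blurred cover

    // Map an optional analyzer verdict (nil = not finished yet).
    init(verdict: Bool?) {
        switch verdict {
        case nil:          self = .pending
        case .some(false): self = .clear
        case .some(true):  self = .sensitive
        }
    }
}
```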
On Sun, Oct 08, 2023 at 02:12:29PM -0700, jerihass wrote:

> One more thing - sometimes the content filter takes a while to determine if `isSensitive` is true/false, and we can show a loading or temporary view there.

It should probably be a part of the preloading logic then. We preload stuff before it scrolls into view.
@jerihass do you need help with the preloading logic?
Not quite yet - I'll dig into the Kingfisher stuff and see if I can start analyzing the images as soon as they finish downloading, then store the `isSensitive` result.
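One way to structure that: cache the verdict per URL and analyze each image at most once, with the fetch and analyze steps injected so the caching logic is testable off-device. In Damus the `fetch` closure would wrap Kingfisher's download completion and `analyze` would wrap `SCSensitivityAnalyzer.analyzeImage`; every name here is hypothetical:

```swift
import Foundation

// Hypothetical per-URL cache of analysis verdicts. Generic over the
// image type so it does not depend on UIKit/Kingfisher directly.
actor SensitivityCache<ImageType> {
    private var results: [URL: Bool] = [:]

    // nil means this URL has not been analyzed yet.
    func cached(_ url: URL) -> Bool? {
        results[url]
    }

    // Analyze each URL at most once, as soon as its image is fetched.
    func preload(_ url: URL,
                 fetch: (URL) async -> ImageType?,
                 analyze: (ImageType) async -> Bool) async {
        guard results[url] == nil else { return }  // already analyzed
        guard let image = await fetch(url) else { return }
        results[url] = await analyze(image)
    }
}
```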
@alltheseas The static and animated images are at a 95% solution; it needs some refactoring. One thing I am unable to figure out is what is causing some images not to be analyzed. Initially I thought the analyzer was taking a long time, but now I think some images are not going through the pre-loader and are skipping the analysis step, though I'm not sure where in the codebase this happens. Is there a chance some images are loaded with Kingfisher without going through the preload image function? https://gist.github.com/jerihass/99663a5489fc60c169bb2dac3700fd88
@jb55 do you recall who helped with kingfisher?
I wonder how iOS sensitive content detection performance compares to dignifai https://github.com/damus-io/damus/issues/1981
User story

As a Damus user who browses nostr, I would like to have the option of Damus detecting NSFW images, so that I do not see NSFW content on my device.

Acceptance criteria

Design

@robagreda does it make sense to "flag" NSFW content and e.g. blur the image, or to not show NSFW-identified content altogether?

Might this be two settings: 1) identify and blur NSFW content, and 2) do not show NSFW content?
Apple implementation reference
https://developer.apple.com/documentation/sensitivecontentanalysis
origin: https://github.com/damus-io/damus/issues/910#issuecomment-1731633193
Related
NIP-36 https://github.com/damus-io/damus/issues/910