Why we're doing this:
Previously, if any image in a batch was flagged as NSFW, the entire batch was discarded, which was frustrating when only one image was problematic. Now we handle this more intelligently.
What the code was doing:
Generating all images
Checking if any image was NSFW
If yes, discarding everything and notifying the user
If no, returning all images
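The old all-or-nothing behavior can be sketched roughly as follows (the `is_nsfw` checker and list-of-images representation are hypothetical, for illustration only):

```python
def return_batch_old(images, is_nsfw):
    """Old behavior: one flagged image discards the whole batch."""
    if any(is_nsfw(img) for img in images):
        # Entire batch dropped; the user receives nothing.
        return []
    return images
```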
What it does now:
Generate all images
Check each image for NSFW content
Keep the safe ones, discard the inappropriate ones
Inform the user how many images were NSFW
If all images are NSFW, notify the user that none could be returned
Return only the safe images
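The steps above amount to a per-image filter rather than a batch-level gate. A minimal sketch, again assuming a hypothetical `is_nsfw` checker and a simple notification callback:

```python
def return_batch_new(images, is_nsfw, notify):
    """New behavior: keep safe images, discard flagged ones, tell the user."""
    safe = [img for img in images if not is_nsfw(img)]
    nsfw_count = len(images) - len(safe)
    if nsfw_count == len(images):
        notify("All images were flagged as NSFW; no images could be returned.")
    elif nsfw_count > 0:
        notify(f"{nsfw_count} of {len(images)} images were flagged as NSFW and removed.")
    # Only the safe images are returned, even if some were discarded.
    return safe
```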
This change makes our image generation more useful and less frustrating for users, allowing them to receive usable images even when some in the batch are inappropriate.
Title: Improve NSFW Image Handling