stashapp / stash

An organizer for your porn, written in Go. Documentation: https://docs.stashapp.cc
https://stashapp.cc/
GNU Affero General Public License v3.0

[Feature] Making stash "offline" #2848

Closed: watermelonpinecone closed this issue 2 years ago

watermelonpinecone commented 2 years ago

Is your feature request related to a problem? Please describe.
I don't like the idea of web services being scraped for NSFW information, or of them receiving any specifics about what's stored in my library.

Describe the solution you'd like
I would like to be able to run stash in an "offline" mode of sorts, where no scraping or querying takes place.

Describe alternatives you've considered
I've gone through the code, issues, and PRs and not found any such functionality or a similar request.

Additional context
I would add a config option for this. I can see I would want to disable scrapers, probably by returning early here with an empty scraper list (rough sketch below). Stashbox doesn't appear to do anything until/unless its config is supplied by the user. Are there any other sources of external requests I should be looking into, please?
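
Something like this minimal sketch is what I have in mind; the `ScrapingDisabled` option and these types are made up for illustration, not stash's actual config or scraper code:

```go
// Hypothetical sketch only: ScrapingDisabled and these types are stand-ins,
// not stash's actual configuration or scraper cache.
package scraper

// Config is a stand-in for stash's configuration interface.
type Config interface {
	// ScrapingDisabled would be the proposed "offline" option.
	ScrapingDisabled() bool
}

// Scraper is a stand-in for a loaded scraper definition.
type Scraper struct {
	ID   string
	Name string
}

// Cache is a stand-in for the cache that backs the scraper list query.
type Cache struct {
	cfg      Config
	scrapers []Scraper
}

// ListScrapers returns the loaded scrapers, or nothing at all when the
// offline option is set, so the UI never offers anything to scrape with.
func (c *Cache) ListScrapers() []Scraper {
	if c.cfg.ScrapingDisabled() {
		return nil
	}
	return c.scrapers
}
```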

Thank you!

ALonelyJuicebox commented 2 years ago

To accomplish this, simply do not add any scrapers. Stash is effectively offline by default: it has no native built-in scrapers. Further, you can fully dictate how much internet access any application has using your OS's built-in firewall.

On Windows, for example: https://www.howtogeek.com/227093/how-to-block-an-application-from-accessing-the-internet-with-windows-firewall/

watermelonpinecone commented 2 years ago

Thanks @ALonelyJuicebox. Would freeones not count as a built-in scraper? It appears to be unconditionally initialized here.

7dJx1qP commented 2 years ago

> Thanks @ALonelyJuicebox. Would freeones not count as a built-in scraper? It appears to be unconditionally initialized here.

The initialization just loads a built-in freeones scraper config file. AFAIK, no querying or scraping takes place at that point; the config only specifies how a scrape/query would be parsed.

It's also only a performer scraper, so if you never click the "Scrape with..." button or the Scrape button next to the URL input when creating performers, it will never be used.
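
Roughly, the split looks like this illustrative sketch (hypothetical names like loadScraperConfig and scrapePerformer, and a placeholder URL, not stash's actual code): the startup path only reads a file, and an HTTP request is made only inside the scrape call that the button triggers.

```go
// Illustrative sketch only; names, types, and the URL are made up.
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"os"
)

// scraperConfig describes how a scrape would be performed and parsed.
type scraperConfig struct {
	name    string
	baseURL string
}

// loadScraperConfig runs at startup. It only reads and parses a file;
// no network I/O happens here.
func loadScraperConfig(path string) (*scraperConfig, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	_ = data // a real implementation would parse the YAML here
	return &scraperConfig{name: "freeones", baseURL: "https://example.com"}, nil
}

// scrapePerformer is the only place a request leaves the machine, and it is
// reached only from an explicit user action ("Scrape with..." or the URL
// scrape button).
func (c *scraperConfig) scrapePerformer(performer string) (string, error) {
	resp, err := http.Get(c.baseURL + "/search?q=" + url.QueryEscape(performer))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	return string(body), nil
}

func main() {
	// Startup path: the config is loaded, nothing is queried.
	cfg, err := loadScraperConfig("freeones.yml")
	if err != nil {
		fmt.Println("load error:", err)
		return
	}
	fmt.Println("loaded scraper:", cfg.name, "- no network traffic so far")
	// cfg.scrapePerformer(...) would only run on an explicit scrape action.
}
```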

cjemorton commented 2 years ago

You could run it inside a jail or a VM and use a firewall to block all traffic except to the machine you're using to view it.

WithoutPants commented 2 years ago

As mentioned above, the freeones scraper is only used if you explicitly scrape performers with it. I don't really see the value in disabling it. Closing as invalid - feel free to comment if I've missed something.

watermelonpinecone commented 2 years ago

Sorry for my misunderstanding, thank you for setting me right!

I plan to try running it with tcpdump and report anything unexpected since I've wasted your time here.