Xerbo / furaffinity-dl

FurAffinity Downloader, now with 100% more Python
BSD 3-Clause "New" or "Revised" License

Please put feedback for the Python version on this thread #38

Open Xerbo opened 4 years ago

Xerbo commented 4 years ago

It's finally happening: once finished, all current code will be moved to a new branch and the Python version will occupy the master branch.

All existing functionality will be migrated; this will also allow native Windows support and easier development.

DallasWhite commented 4 years ago

Would this be why the current script isn't working? The last time I successfully used it was before this announcement, but when I went to use it today, I couldn't get it working.

Xerbo commented 4 years ago

This doesn't seem to be a problem with the script but rather with FurAffinity; with quiet mode turned off, wget exits with "2020-02-27 22:09:03 ERROR 503: Service Temporarily Unavailable".

Which most likely means that FurAffinity has blacklisted this script's user-agent, or that it's failing some sort of Cloudflare anti-bot test.

Manni1000 commented 4 years ago

When can I download the Python version?

serveral1 commented 4 years ago

Any plans on adding the -n option (or something better) back? I found it very useful whenever I needed to update someone's gallery based on just their latest submissions.

Xerbo commented 4 years ago

Would this be a good enough replacement?

https://github.com/Xerbo/furaffinity-dl/blob/071e8692adc6dbd8a4109c681cbd02cd0b780a90/furaffinity-dl.py#L16
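For readers wondering what the old `-n` behaviour (keep only the newest N submissions) might look like in Python, here is a minimal sketch; the function name and signature are hypothetical, not taken from the actual script:

```python
from itertools import islice

def latest_submissions(submissions, n=None):
    """Yield at most the newest n submissions; n=None yields everything.

    `submissions` is assumed to be ordered newest-first,
    the order gallery pages are scraped in.
    """
    return submissions if n is None else islice(submissions, n)

# Example: keep only the 3 newest of a pretend gallery listing.
gallery = ["sub5", "sub4", "sub3", "sub2", "sub1"]
print(list(latest_submissions(gallery, 3)))  # ['sub5', 'sub4', 'sub3']
```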

reederda commented 4 years ago

When downloading favorites, the Python tool prints "Downloading page" when continuing to the next page, but appears to fail and instead just cycles through the first page repeatedly.

Xerbo commented 4 years ago

@jkmartindale kindly fixed this in #43

felikcat commented 4 years ago
- `-a` attempts: how many connection retry attempts before exiting; -1 for unlimited, ? is default.
- `-t` timeout: wait this long in seconds before another connection retry attempt; ? is default.

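A retry loop along those lines could be sketched as follows; `download_with_retries` and its defaults are hypothetical, not flags the script actually has:

```python
import time

def download_with_retries(fetch, attempts=3, timeout=5):
    """Call fetch() until it succeeds.

    attempts -- maximum number of tries; -1 means retry forever.
    timeout  -- seconds to sleep between failed tries.
    """
    tried = 0
    while True:
        tried += 1
        try:
            return fetch()
        except Exception:
            if attempts != -1 and tried >= attempts:
                raise  # out of attempts, surface the last error
            time.sleep(timeout)

# Example: a fetch that fails twice with a 503 before succeeding.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("503 Service Temporarily Unavailable")
    return "ok"

print(download_with_retries(flaky, attempts=5, timeout=0))  # ok
```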
Xial commented 3 years ago

I would love a way to insert a pause between downloads, something like five to fifteen seconds, to ease up on how much I'd otherwise be hitting them.

By default, it just feels like it hits the server a bit too fast overall, especially as I'd like to get back to my long-term local archiving habit and don't want to be blocked for catching up on my archives.

Also, is it possible to also place the created JSON files in a subfolder, or otherwise not keep them once downloading is finished?

Xerbo commented 3 years ago

I've downloaded some large (>2000 submissions) galleries and have had no problems with rate limiting yet, but it would still be a good idea to add a delay. As for putting the meta files in a different directory: good idea, it would really clean up the output folder.

I'll get on this tomorrow, should be pretty easy to do.

Xerbo commented 3 years ago

@Xial done, see 0f0fe3e6 and 85cd3cd.

Xial commented 3 years ago

Looking forward to giving those a go later today. Thank you! :)

Xial commented 3 years ago

> Would this be a good enough replacement?
>
> https://github.com/Xerbo/furaffinity-dl/blob/071e8692adc6dbd8a4109c681cbd02cd0b780a90/furaffinity-dl.py#L16

Perhaps an option to only process a specific number of pages would be appropriate. Occasionally, I might have saved one or two images by hand from the recent stuff, but then notice that the artist just sprayed 30 or 40 pictures up all at once and realize it'd be better to just automate the process.

It could also mitigate things like this when refreshing a gallery:

```
...
Skipping "Bea", since it's already downloaded
Downloading page 10
Skipping "Golden Birb", since it's already downloaded
...
```

:)

ponchojohn1234 commented 3 years ago

I'm wondering if it would be possible to download a specific folder from a user instead of their entire gallery. I know an old browser extension was able to do it, but it doesn't work with the new theme.

Xial commented 3 years ago

I noticed there's a --start option that lets users pick a page number to start from. Could there be a --stop-at option, so that once the page count goes beyond a certain point, the script stops downloading?

As an example, there's one user who has uploaded lots of art over the years, and having the script go through 30+ pages just to announce that it's skipping files already present makes me feel like a bad citizen. However, downloading 60 images by hand for one artist is a little tedious.

Thanks. :)
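`--start` exists per the comment above; `--stop-at` is hypothetical, but wiring it in with argparse could look roughly like this:

```python
import argparse

parser = argparse.ArgumentParser(description="page-range sketch")
parser.add_argument("--start", type=int, default=1,
                    help="page number to start from (exists in the script)")
parser.add_argument("--stop-at", type=int, default=None,
                    help="stop once past this page (hypothetical)")
# Simulate a command line; argparse maps --stop-at to args.stop_at.
args = parser.parse_args(["--start", "3", "--stop-at", "5"])

pages_visited = []
page = args.start
while True:
    if args.stop_at is not None and page > args.stop_at:
        break  # past the requested range: stop instead of skipping forever
    pages_visited.append(page)  # a real run would download this page here
    page += 1

print(pages_visited)  # [3, 4, 5]
```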

Radiquum commented 2 years ago

I just want to say: thank you for this tool. It's so much easier than downloading 3k+ various images manually :3

And yes, it works on Arch Linux and Android (Termux) with Python 3.10.x.