Closed naruhina2016 closed 7 years ago
You need to have the date in your filename. Or, search for your last downloaded pic on Sankaku Channel (e.g. with iqdb.org or its MD5 hash) and look up when it was uploaded. After that, start downloading until you reach page 1000. In my example we are using `yuri` as the tag. Once you're done, look at which date the last downloaded pic has. Mine is 26.07.2014. Then I start a new batch job using this as the tag: `yuri date:<=27.07.2014`. Now Sankaku Complex will download all images which were uploaded before 27.07.2014 ;) Replace it with your own date.
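The date-splitting trick above can be sketched in a few lines. This is just an illustration, not part of Grabber; the function name and the DD.MM.YYYY date format are assumptions taken from the example in this message:

```python
from datetime import date, timedelta

def next_batch_tag(tag, last_downloaded):
    """Build the tag query for the next batch: everything uploaded
    on or before the day after the last downloaded image's date."""
    cutoff = last_downloaded + timedelta(days=1)
    # Sankaku's date filter uses DD.MM.YYYY in this example
    return f"{tag} date:<={cutoff.strftime('%d.%m.%Y')}"

print(next_batch_tag("yuri", date(2014, 7, 26)))
# yuri date:<=27.07.2014
```

Run this again with each new cutoff date to walk back through the whole tag, 1000 pages at a time.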
thanks, but I wasn't talking about Danbooru. I'm using Image Grabber now.
lol, you're the same person from back then. I also switched from Danbooru Downloader, but this method works with both programs.
Do you mean you cannot download more than 1,000 images or you cannot download images past page 1,000?
From your message, it seems to be 1,000 pictures, but I know that Sankaku has a 1,000-page limit, which would cap the images at 20,000. There was actually a way to bypass this in previous Grabber versions, but it seems it was broken by the 5.2.0 update.
PS: you can try to re-enable this bypass by setting `packing_enable` to `false` in your `settings.ini` file. It will stop using packing during the download, so when it reaches page 1,001 it will use smart paging: `?prev=` and `?next=` instead of `?page=` (this only works in HTML mode).
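For illustration, here is a sketch of the two paging modes described above. The base URL and parameter names are assumptions for the example; the real requests are built inside Grabber:

```python
def listing_url(base, page=None, next_id=None, prev_id=None):
    """Sketch of the two paging modes: plain ?page=N up to the page
    limit, then cursor-style ?next=/?prev= keyed on the last seen
    post id once the limit is reached."""
    if page is not None:
        return f"{base}?page={page}"
    if next_id is not None:
        return f"{base}?next={next_id}"
    return f"{base}?prev={prev_id}"

print(listing_url("https://chan.sankakucomplex.com/posts", page=50))
print(listing_url("https://chan.sankakucomplex.com/posts", next_id=123456))
```

The cursor-style URLs are why the page before has to be loaded first: the `next`/`prev` id comes from the last post of the previous page.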
BTW, is there any specific reason you switched from Nandaka's Danbooru Downloader? (So I can know what to improve further and what Grabber's strong points are.)
awesome, I didn't know that :o Danbooru Downloader is totally inferior to Grabber. If not using Sankaku Complex, you have to download a ton of tags or the pics won't contain any. And Sankaku didn't work properly for me. Your program has so many more features (like being able to display specific tags in the filename) and is really sophisticated; that's why I like perfectionists. But I'm really thankful for his Pixiv downloader :D
Bionus, the reason I switched from Danbooru Downloader to Grabber is that the Sankaku Complex login cookies always stop working randomly, and later on, around 1,000 pics in (page 50 or so), it stops too.
how can I set `packing_enable` to false in the settings? And in which program, Danbooru Downloader or Image Grabber?
This is Grabber's issue tracker, so we're talking about this program. I have no idea how Nandaka's Danbooru Downloader works.
You can edit your `settings.ini` file (in `C:/Users/%USERNAME%/AppData/Local/Bionus/Grabber`) and add a line just after `[General]` containing `packing_enable=false`.
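With that line added, the top of `settings.ini` would look something like this (any other keys under `[General]` stay where they are):

```ini
[General]
packing_enable=false
```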
It works, thank you... xD
I'm confused. Where are the 1000 pages from? The limit is 50 pages, after that you need special indexing.
Also, is there a way to make it download continuously? It seems pretty weird that it has to visit all the pages first and only then start downloading, which pretty much means you have to stop at some reasonable number. What if you want everything in `order:quality`, or a couple million images?
> I'm confused. Where are the 1000 pages from? The limit is 50 pages, after that you need special indexing.
If you're logged in, it goes up to page 1,000 from what I remember. Otherwise it's indeed 50 pages, but the "special indexing" you mention is supported by Grabber, and it will switch to it as long as you also download the page before (since it needs to load page 50 to get the results for page 51).
> Also is there a way to make it continuously download? Seems pretty weird that it has to first visit all pages and only then start downloading, pretty much means you gotta stop at some reasonable number. What if you want everything in order:quality, or a couple million?
That's what packing does: it starts downloading images every time Grabber has `n` images queued. It's set to 1,000 images by default, so it downloads pages in packs of 50, but if you really want to lower it, that's possible. I don't really see a reason to do it, though.
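A minimal sketch of that packing behavior, with plain lists standing in for the real page fetching and downloading (the function name and structure are just for illustration):

```python
def download_in_packs(pages, pack_size=1000):
    """Sketch of 'packing': each page yields a list of image URLs;
    once the buffer holds pack_size images, flush (download) them
    instead of waiting until every page has been listed."""
    buffer, downloaded = [], []
    for page in pages:
        buffer.extend(page)
        while len(buffer) >= pack_size:
            downloaded.extend(buffer[:pack_size])  # stand-in for real downloads
            buffer = buffer[pack_size:]
    downloaded.extend(buffer)  # flush whatever is left after the last page
    return downloaded

# 3 pages of 50 URLs each, flushed in packs of 100
pages = [[f"img{i * 50 + j}" for j in range(50)] for i in range(3)]
print(len(download_in_packs(pages, pack_size=100)))  # 150
```

Lowering `pack_size` just makes the flushes more frequent; the total set of downloaded images is the same.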
did something happen to Grabber? Because it won't download the complete set of pics from Sankaku Complex.
Maybe you can give more context, because I just finished downloading a bunch of images from Sankaku.
Acerola from Pokémon has 439 pics on Sankaku, yet it only downloaded 273 of 439. That never happened until today, after the new update. So what should I do?
What steps will reproduce the problem?
Sankaku Complex won't download more than 1,000 pictures.
What is the expected behavior? What do you get instead?
Nothing, it just won't download more.
How often does this problem occur?
All the time, I guess?
What version of the program are you using? On what operating system?
5.2.3 on Windows 10
Please provide any additional information below