Nandaka / DanbooruDownloader

*booru image downloader
http://nandaka.devnull.zone/

Sankakucomplex can't download after page 50 #111

Closed: ashllay closed this issue 6 years ago

ashllay commented 7 years ago

Log:

    [DoBatchJob] Downloading list: https://chan.sankakucomplex.com/post/index?tags=nier:_automata&page=51&login=heyned&password_hash="my hash"
    [DoBatchJob] Error Getting List (1 of 5): The remote server returned an error: (404) Not Found. Wait for 60s.

Looks like Sankaku have changed something. After page 50 the link changes:

    page 50: /?tags=nier%3A_automata&page=50
    page 51: /?next=5783805&tags=nier%3A_automata&page=51
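
A quick way to reproduce this outside the application (a sketch only; the cookie value is a placeholder and the next ID is the one from the log above):

    import requests

    BASE = "https://chan.sankakucomplex.com"
    COOKIE = "login=<username>; pass_hash=<long string>"  # placeholder, see the cookie FAQ further down

    for url in (
        f"{BASE}/?tags=nier%3A_automata&page=50",               # still works
        f"{BASE}/?tags=nier%3A_automata&page=51",               # now rejected (404 / "only 50 pages")
        f"{BASE}/?next=5783805&tags=nier%3A_automata&page=51",  # works once a post id is passed as next=
    ):
        r = requests.get(url, headers={"Cookie": COOKIE})
        print(r.status_code, url)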

Nandaka commented 7 years ago

You can modify the DanbooruProvider.xml entry for Sankaku as below:

  <DanbooruProvider>
    <Name>Sankaku Complex (HTTPS)</Name>
    <DefaultLimit>20</DefaultLimit>
    <HardLimit>1000</HardLimit>
    <Preferred>Html</Preferred>
    <QueryStringXml />
    <QueryStringJson />
    <QueryStringHtml>/?%_query%</QueryStringHtml>
    <Url>https://chan.sankakucomplex.com</Url>
    <UserName>login=required; pass_hash=required</UserName>
    <Password />
    <LoginType>Cookie</LoginType>
    <PasswordSalt>choujin-steiner--%PASSWORD%--</PasswordSalt>
    <PasswordHash />
    <!-- 2016-10-07 12:06 -->
    <DateTimeFormat>yyyy-MM-dd HH:mm</DateTimeFormat>
    <BoardType>Danbooru</BoardType>
    <TagDownloadUseLoop>true</TagDownloadUseLoop>
  </DanbooruProvider>

And maybe update the user-agent to match what your browser uses?
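
For reference, a rough sketch of how the <Url> and <QueryStringHtml> entries combine into the list URL (an assumption about how %_query% expands, not the application's actual code); the user-agent would just be one more request header on top of this:

    from urllib.parse import urlencode

    provider_url = "https://chan.sankakucomplex.com"  # <Url>
    query_template = "/?%_query%"                     # <QueryStringHtml>

    def list_url(tags, page):
        # Build the tag/page query string and drop it into the template.
        query = urlencode({"tags": tags, "page": page})
        return provider_url + query_template.replace("%_query%", query)

    print(list_url("nier:_automata", 51))
    # -> https://chan.sankakucomplex.com/?tags=nier%3A_automata&page=51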

pwner151 commented 7 years ago

That didn't fix it for me.

Using RWBY as an example.

https://chan.sankakucomplex.com/?tags=rwby&page=50 You can view results up to this point by changing the page number in the URL, up to page 50. Changing it to 51 or above throws the error "Error: You can only view up to 50 pages of results this way".

If you go to page 50 and hit the next-page button, the URL changes to https://chan.sankakucomplex.com/?next=5691118&tags=rwby&page=51

It looks like they are using the ID of the first image for that tag on the next page as the next= value.

Hopefully this helps!
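
One way to sanity-check that reading of next= is to compare it with the post IDs actually shown on page 51 (a rough sketch; it assumes Danbooru-style /post/show/<id> links and a placeholder cookie, so adjust to the real markup):

    import re
    import requests

    COOKIE = "login=<username>; pass_hash=<long string>"  # placeholder
    url = "https://chan.sankakucomplex.com/?next=5691118&tags=rwby&page=51"

    html = requests.get(url, headers={"Cookie": COOKIE}).text
    ids = [int(i) for i in re.findall(r"/post/show/(\d+)", html)]
    # If the guess is right, the highest id on page 51 should sit at or just below next=.
    print("next =", 5691118, "highest id on page 51 =", max(ids) if ids else None)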

Nandaka commented 7 years ago

Error: You can only view up to 50 pages of results this way

Key in your cookie for Sankaku; that message appears because the session is not authenticated.

Nandaka commented 7 years ago

(screenshot)

pwner151 commented 7 years ago

Could you explain "key in your cookie for Sankaku; that message appears because the session is not authenticated"? I don't follow your phrasing and Google isn't being helpful.

Nandaka commented 7 years ago

From the readme:

Q7: How to login to Gelbooru/Sankaku/get cookie value?
A7: Follow this step:
    1. Press F12 on your Chrome browser and select Network tab.
    2. Go to the booru site and login.
    3. Click one of the entry and copy the Cookie value from the Request Header. 
       For gelbooru, it should look like this: user_id=<number>; pass_hash=<long string>
       For sankaku, login=<username>; pass_hash=<long string>;
    4. Paste the Cookie value to the Username field.
    5. Set Login Type to Cookie. Refer to http://i.imgur.com/rCCjnPs.png
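
In HTTP terms, steps 3-5 amount to sending the copied string as a Cookie request header on every list request; a minimal sketch with placeholder values:

    import requests

    # Exactly the string pasted into the Username field (placeholders here).
    cookie = "login=<username>; pass_hash=<long string>"

    r = requests.get(
        "https://chan.sankakucomplex.com/?tags=rwby&page=50",
        headers={"Cookie": cookie},
    )
    print(r.status_code)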

pwner151 commented 7 years ago

Ahh, I did that. It didn't fix the issue, though. Also, as a side question, would it be "login=NAME; pass_hash=Hash" or "login=NAME pass_hash=Hash"?

I don't know how you managed to see page 51 without it throwing a fit or you hitting next page.

If I type page 51 into the URL, I get this: (screenshot: 51 in url). If I type 50, then hit the next-page icon, I get this: (screenshot: 51 via next).

I believe this is the same issue ashllay is having. The parser (is this the right term?) can't manually go from 50 to 51 (without tricks), since Sankaku seems to have changed their backend to discourage this kind of thing.

Hopefully I'm helping with relevant information to this issue. x.x

Nandaka commented 7 years ago

Use ; to separate the fields. I just copy and paste the whole cookie value into the provider XML, though.

(screenshot)

Notice that it also has mode=view; auto_page=1; in the cookie; maybe this is why?

I'll check if the application can get the next URL.
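
For what it's worth, a rough sketch of reading the next-page link straight out of the HTML instead of computing the ID; it assumes the paginator's next link keeps the next= parameter in its href, which may not match the real markup:

    import re
    from urllib.parse import urljoin

    import requests

    BASE = "https://chan.sankakucomplex.com"
    COOKIE = "login=<username>; pass_hash=<long string>"  # placeholder

    def next_page_url(current_url):
        html = requests.get(current_url, headers={"Cookie": COOKIE}).text
        # Grab the first href that already carries a next=<id> parameter.
        m = re.search(r'href="([^"]*next=\d+[^"]*)"', html)
        return urljoin(BASE, m.group(1).replace("&amp;", "&")) if m else None

    print(next_page_url(f"{BASE}/?tags=rwby&page=50"))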

pwner151 commented 7 years ago

None of this is fixing the problem that I've been having, sadly. :/

Would it be better for me to send you my personal "installation", zipped? It'll also have my SC login, so you can see what's wrong there. (It might be that I haven't contributed to the site or something.)

tikkariz commented 7 years ago

I have the same problem as pwner151, and have had it for a while (over 4 months). I tried the things Nandaka said in this thread, but they are not working for me; it still gets stuck at page 51. http://i.imgur.com/B7XjjU1.png http://i.imgur.com/xcIx1YF.png I even downloaded the latest fresh version, but nothing. Hoping something can be done soon.

SomeonefromArg commented 7 years ago

I'm having problems too trying to download after page 50. Is there a way to provide a username and password without the cookie option?

rexii2300 commented 7 years ago

Same issue here. Using the cookie method doesn't help either.

jasonmbrown commented 7 years ago

I'm also still encountering this error, including when I attempt to load any page past 50 in my browser, unless auto-paging is turned off and I go to page 50 then click next page.

Browser and Downloader are both logged in.

I might try looking at the source later and doing a crap fix for it, depending on the way it's parsing.

Nandaka commented 7 years ago

Looks like I need to pass the last post ID as part of the next parameter.

The only issue is if you start the batch download on page 51+: the program doesn't know what the last ID is.

EDIT: Looks like it allows page jumping to 51+ as long as you pass the next parameter.
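
A sketch of that logic: remember the lowest post ID seen so far and pass it as next= once past page 50. It assumes descending-ID ordering and Danbooru-style /post/show/<id> links, and the exact off-by-one semantics of next= may differ from this guess:

    import re
    import requests

    BASE = "https://chan.sankakucomplex.com"
    COOKIE = "login=<username>; pass_hash=<long string>"  # placeholder

    def fetch_pages(tags, last_page):
        last_id = None
        for page in range(1, last_page + 1):
            params = {"tags": tags, "page": page}
            if page > 50 and last_id is not None:
                params["next"] = last_id  # the extra parameter Sankaku now expects
            html = requests.get(BASE + "/", params=params, headers={"Cookie": COOKIE}).text
            ids = [int(i) for i in re.findall(r"/post/show/(\d+)", html)]
            if ids:
                last_id = min(ids)  # lowest id on this page seeds the next request
            yield page, ids

    # Example: for page, ids in fetch_pages("rwby", 55): print(page, len(ids))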

Nandaka commented 7 years ago

Try https://github.com/Nandaka/DanbooruDownloader/releases/tag/v3.20170922