Closed by GoogleCodeExporter 9 years ago
Ok, so after some experimenting, I found that they have a hidden .zip link, so the
add-on DownloadHelper (for Firefox) can download that file.
Can these image sets be supported by Grabber as well?
Original comment by JarieSui...@gmail.com
on 14 Jul 2014 at 6:14
I also found that if they have a higher-resolution version, it has to be loaded
before DownloadHelper can see it.
Original comment by JarieSui...@gmail.com
on 14 Jul 2014 at 6:26
Ok, so I've found that it DOES work with them (I hadn't realized it would download
.zip files), but it tends to have trouble finishing the downloads.
Also, just so you know, they usually seem to have a "fullscreen" download zip,
which definitely contains the full-quality images; I'm not totally sure they
always have that link, though.
An example link:
http://i2.pixiv.net/img-zip-ugoira/img/2014/07/14/09/29/48/44703147_ugoira1920x1080.zip
Original comment by JarieSui...@gmail.com
on 14 Jul 2014 at 6:58
I just added "animation array" support, and it should work just fine. What kind of
trouble do you mean?
I tested it on the picture you linked and downloaded a 3.5 MB zip archive.
Grabber downloads the 1920x1080 version by default.
P.S.: even if there is a 1920x1080 prefix, it does not mean the picture is really
that big. For example, the original size of the picture from your link is 350x450.
P.P.S.: I can't make GIFs from them, because converting jpg pictures
would lose quality dramatically.
P.P.P.S.: this means that PIXIV doesn't use any real animation format; it
just shows the pictures at intervals using JavaScript.
Original comment by catgirlfighter
on 14 Jul 2014 at 8:37
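To illustrate the point above: these "ugoira" archives are just flat zips of numbered frame images that Pixiv's JavaScript flips through. This is a hypothetical sketch (not Grabber's actual code) that fakes such an archive in memory and recovers the playback order from the file names:

```python
import io
import zipfile

# Assumption: frames are named with zero-padded numbers (000000.jpg,
# 000001.jpg, ...), so lexicographic order is playback order.

def build_fake_ugoira(frame_count):
    """Create an in-memory zip with dummy jpg-named frame entries."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for i in range(frame_count):
            zf.writestr("%06d.jpg" % i, b"not-a-real-jpeg")
    buf.seek(0)
    return buf

def frame_names(zip_bytes):
    """Return the frame file names in playback (sorted) order."""
    with zipfile.ZipFile(zip_bytes) as zf:
        return sorted(zf.namelist())

frames = frame_names(build_fake_ugoira(3))
print(frames)  # ['000000.jpg', '000001.jpg', '000002.jpg']
```

Since the archive holds only still jpg frames, there is no video stream to extract, which matches the observation that any GIF conversion would have to re-encode lossy frames.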
>I just added "animation array" support
I mean, last Friday :)
Original comment by catgirlfighter
on 14 Jul 2014 at 9:02
Thanks for clearing that up! Yeah, I started to realize that after a while...
the first few times I came across them in Grabber, it timed out every time, so I
assumed there was an issue. But after a few more tries, it got them. The files
are just so much bigger sometimes that it can't finish them.
Could a buffer be added, or something, so that an incomplete download can be
resumed? I don't know if that would really be a good idea though, since it
could also really clog up the system...
Either way, I understand it now and am fine with it as it is. Thank you!
Original comment by JarieSui...@gmail.com
on 14 Jul 2014 at 4:22
You can't resume downloads from most resources (the HTTP servers do not
allow a partial/position-based start), so I didn't implement it. I.e., I tried it
before.
Original comment by catgirlfighter
on 15 Jul 2014 at 9:27
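For context on what a resume attempt involves: HTTP/1.1 defines a `Range` request header, and a server that honours it answers "206 Partial Content" with just the missing bytes, while a server that ignores it answers 200 with the whole body. A minimal sketch (function name is my own, not Grabber's) of building such a request:

```python
import urllib.request

def make_resume_request(url, bytes_already_have):
    """Build a GET request asking the server for the rest of the file.

    If nothing was downloaded yet, no Range header is attached and the
    request is an ordinary full download.
    """
    req = urllib.request.Request(url)
    if bytes_already_have > 0:
        # "bytes=N-" means: from byte offset N to the end of the file.
        req.add_header("Range", "bytes=%d-" % bytes_already_have)
    return req

req = make_resume_request("http://example.com/44703147.zip", 1024)
print(req.get_header("Range"))  # bytes=1024-
```

Whether this works depends entirely on the server; a client that sends `Range` must still be prepared for a plain 200 reply containing the full file.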
Actually, it's not a bad idea to TRY to resume; if it can't be resumed by any
means, just restart as usual.
Original comment by catgirlfighter
on 15 Jul 2014 at 9:31
>it's not a bad idea to TRY resume it
when the retry count is more than 0, obviously.
Original comment by catgirlfighter
on 15 Jul 2014 at 9:32
The feature that tries to resume downloads has been implemented. Hope it works
fine.
Original comment by catgirlfighter
on 23 Jul 2014 at 1:05
Original comment by catgirlfighter
on 24 Jul 2014 at 9:35
Original issue reported on code.google.com by
JarieSui...@gmail.com
on 14 Jul 2014 at 1:55