Bionus / imgbrd-grabber

Very customizable imageboard/booru downloader with powerful filenaming features.
https://www.bionus.org/imgbrd-grabber/
Apache License 2.0

Request for Improvement: Add default for `-m` to grabber-cli #2911

Open dazey3 opened 1 year ago

dazey3 commented 1 year ago

Description

I'm new to using imgbrd-grabber, so forgive me if I'm just doing something wrong. I am trying to download using the command line, but I can't seem to download anything.

If I run the command:

    > grabber-cli -t "rating:safe wallpaper" -s "e621.net" --return-count

I get the output:

2340

However, if I run the command:

    > grabber-cli -t "rating:safe wallpaper" -s "e621.net" --download

The command returns. There is no output and I see nothing downloaded.

Checking main.log:

[01:06:40.747][Warning] Javascript 'model.js' file not found for 'Gfycat' in `[...]/Grabber_v7.10.1_x64/sites/Gfycat/model.js`
[01:06:40.780][Info] Temp directory purged of 0/1 files (0 failed)
[01:06:40.780][Error] The image limit must be more than 0

Any pointers?

System information

ari-party commented 1 year ago

Tell me if this happens when you use the GUI version with the same tags and site.

dazey3 commented 1 year ago

> Tell me if this happens when you use the GUI version with the same tags and site.

@Astrism It works perfectly fine with the GUI


Also, it may be worth noting that

    > grabber -c -t "rating:safe wallpaper" -s "e621.net" --return-count

seems to do nothing at all, whereas grabber-cli at least returns a count (even though it doesn't download anything).

main.log (grabber -c -t "rating:safe wallpaper" -s "e621.net" --return-count):

[22:47:57.383][Warning] Javascript 'model.js' file not found for 'Gfycat' in `[...]/Grabber_v7.10.1_x64/sites/Gfycat/model.js`
[22:47:57.416][Info] Temp directory purged of 0/1 files (0 failed)
[22:47:57.416][Info] [e621.net][Html] Loading page `https://e621.net/posts?limit=20&page=1&tags=rating%3Asafe wallpaper`
[22:47:58.003][Info] [e621.net][Html] Receiving page `https://e621.net/posts?limit=20&page=1&tags=rating%3Asafe wallpaper`
[22:47:58.009][Info] [e621.net][Html] Parsed page `https://e621.net/posts?limit=20&page=1&tags=rating20Asafe wallpaper`: 20 images (0), 493 tags (-1), 2340 total (117), 117 pages (%11)

except nothing prints... and this is the exact same main.log output as grabber-cli -t "rating:safe wallpaper" -s "e621.net" --return-count, but grabber-cli at least prints a result.

ari-party commented 1 year ago

Does it error immediately or after so many seconds?

dazey3 commented 1 year ago

> Does it error immediately or after so many seconds?

@Astrism It takes a few seconds, then the command line returns and nothing was printed. Or in the case of downloading, it takes a few seconds, then the command line returns and nothing was downloaded. It'd be REALLY nice if there was some indication of failure, but the CLI just silently does nothing.

Bionus commented 1 year ago

It's supposed to print it as well, but isn't there a pretty clear error in your log?

    The image limit must be more than 0

Make sure to pass an image limit to the CLI, IIRC that's the -m/--max argument. The link between "limit" and "max" could be made more explicit though, I guess.

dazey3 commented 1 year ago

Are these supposed to be functionally the same?

    C:\...\Grabber_v7.10.1_x64>grabber -c -t "rating:safe wallpaper" -s "e621.net" --return-count

    C:\...\Grabber_v7.10.1_x64>grabber-cli -t "rating:safe wallpaper" -s "e621.net" --return-count
    2340

    C:\...\Grabber_v7.10.1_x64>

main.log

[20:36:08.594][Warning] Javascript 'model.js' file not found for 'Gfycat' in `C:/.../Grabber_v7.10.1_x64/sites/Gfycat/model.js`
[20:36:08.626][Info] Temp directory purged of 0/0 files (0 failed)
[20:36:08.626][Info] [e621.net][Html] Loading page `https://e621.net/posts?limit=20&page=1&tags=rating%3Asafe wallpaper`
[20:36:09.209][Info] [e621.net][Html] Receiving page `https://e621.net/posts?limit=20&page=1&tags=rating%3Asafe wallpaper`
[20:36:09.225][Info] [e621.net][Html] Parsed page `https://e621.net/posts?limit=20&page=1&tags=rating20Asafe wallpaper`: 20 images (0), 493 tags (-1), 2340 total (117), 117 pages (%11)

dazey3 commented 1 year ago

> It's supposed to print it as well, but isn't there a pretty clear error in your log?
>
> `The image limit must be more than 0`
>
> Make sure to pass an image limit to the CLI, IIRC that's the -m/--max argument. The link between "limit" and "max" could be made more explicit though, I guess.

@Bonus I'm sorry, are you saying that the `-m` argument is required to run the following command?

    C:\...\Grabber_v7.10.1_x64>grabber-cli -t "rating:safe wallpaper" -s "e621.net" --download

I want to download everything, though; by not passing a maximum, I expected to get everything.

dazey3 commented 1 year ago

Running

    C:\...\Grabber_v7.10.1_x64>grabber-cli -l "C:\...\imgbrd-grabber" -m 2340 -t "rating:safe wallpaper" -s "e621.net" --download

gives me this error:

    [Error] You need a filename for downloading images

Can I not download to a location with `-l`? Also, by "filename" does the program mean "directory"? I'm not sure what the difference is between a "location" and a "filename" for this program.

Never mind, I found documentation that explains what "filename" means here: https://www.bionus.org/imgbrd-grabber/docs/filename.html

If I use -f, then I get the error that the directory already exists and nothing downloads.

[20:54:47.478][Info] File already exists: `\...\imgbrd-grabber`
[20:54:47.478][Info] File already exists: `\...\imgbrd-grabber`
[20:54:47.478][Info] File already exists: `\...\imgbrd-grabber`
[20:54:47.478][Info] File already exists: `\...\imgbrd-grabber`
[20:54:47.478][Info] File already exists: `\...\imgbrd-grabber`
[20:54:47.478][Info] File already exists: `\...\imgbrd-grabber`

ari-party commented 1 year ago

You need to set a filename. You can do `%md5%.%ext%` to quickly get past it (don't trust me on this, I don't have the filename syntax on hand).

dazey3 commented 1 year ago

> You need to set a filename. You can do `%md5%.%ext%` to quickly get past it (don't trust me on this, I don't have the filename syntax on hand).

So, I was able to get things working with:

    > grabber-cli -f "C:\...\imgbrd-grabber\%artist%\%md5%.%ext%" -m 10000000 -t "rating:safe wallpaper" -s "e621.net" --download

At the moment, it seems like `-m` is required, except I don't really understand why. If I want to download everything, why should I need to provide a maximum? Seems like the default should be the `--return-count` value, but I guess for now I'll just pass some absurd amount to make sure I get everything.
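A possible scripted workaround until a default exists (just a sketch, not a built-in feature): ask for the count first, then feed it back as the limit. This assumes `grabber-cli` is on PATH, that `--return-count` prints a bare number as shown above, and it is written as a POSIX shell script (the quoting would need adapting for Windows cmd); the `downloads/` output path is a placeholder.

```shell
# Sketch only: use the reported total as the -m limit.
# Assumes grabber-cli is on PATH and --return-count prints a bare number.
COUNT=$(grabber-cli -t "rating:safe wallpaper" -s "e621.net" --return-count)
grabber-cli -f "downloads/%artist%/%md5%.%ext%" \
  -m "$COUNT" -t "rating:safe wallpaper" -s "e621.net" --download
```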

ari-party commented 1 year ago

Is `all` a valid value for the max download limit? It is in the GUI.

dazey3 commented 1 year ago

> Is `all` a valid value for the max download limit? It is in the GUI.

I tried `-m all`, `-m ALL`, `-m "all"`, and `-m "ALL"`. All of these result in:

    [Error] The image limit must be more than 0

ari-party commented 1 year ago

> > Is `all` a valid value for the max download limit? It is in the GUI.
>
> I tried `-m all`, `-m ALL`, `-m "all"`, and `-m "ALL"`. All of these result in:
>
> `[Error] The image limit must be more than 0`

Throw it a big ass integer based on your CPU's architecture then.
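If the limit ends up stored as a signed 32-bit integer (an assumption on my part; I haven't checked the parsing code), the largest value that safely fits would be 2^31 - 1:

```python
# Assumption: the -m limit is parsed into a signed 32-bit integer.
# The largest representable value is then 2**31 - 1.
INT32_MAX = 2**31 - 1
print(INT32_MAX)  # 2147483647
```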