mikf / gallery-dl

Command-line program to download image galleries and collections from several image hosting sites
GNU General Public License v2.0
11.38k stars · 929 forks

automatically downloading MEGA links via metadata option #2261

Open ZenythFactor opened 2 years ago

ZenythFactor commented 2 years ago

Will there ever be a passive option where, let's say, you're scraping someone's Twitter or Patreon, and it automatically queues and downloads whatever link they might have put in the description? Specifically a MEGA.co.nz link?

I follow a Patreon creator and learned they're closing everything down in a few days, possibly severing the MEGA links to HQ images and archives as well. It would also be tiring and time-consuming to go through every link manually to save what's going to be gone.

a84r7a3rga76fg commented 2 years ago

I don't think that's a good idea, because MEGA will ban you after you've downloaded a few gigs. It's better to get the link from the description and use another program that can bypass those MEGA bans with proxies.

ZenythFactor commented 2 years ago

Yeah, but I don't want to sit through maybe 150 or fewer Patreon/Fanbox/Twitter posts that may contain MEGA or even Dropbox links.

Maybe put each MEGA download in a queue and halt it when it's past the limit, to avoid bans. What's the source on the banning anyway? I doubt it matters when you're on a certain subscription.

rautamiekka commented 2 years ago

> MEGA will ban you after you've downloaded a few gigs

5GB, yes, and after you use that, you get 1GB back every 6h, i.e. another 5GB after 30h (1d 6h). It would be possible to save the timestamp to calculate how much one can download depending on their subscription, but I can already tell that'll take some thinking.

> Maybe put each MEGA download in a queue and halt it when it's past the limit, to avoid bans. What's the source on the banning anyway? I doubt it matters when you're on a certain subscription.

At least if you can believe https://www.cloudstorageoptions.com/mega/ (they use Anti-Adblock, though), you get as much bandwidth as you have storage space, except for the free tier's 5GB of bandwidth against its 50GB of storage. gdl already does the waiting game with, for example, DeviantArt, when the API returns a certain complaint. Leaving MEGA on the side while waiting for the timeout to pass should be possible.
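The refill arithmetic described above could be sketched roughly like this. This is a toy model only: the flat 1GB-per-6h refill and the 5GB cap come from the discussion here, not from any documented MEGA accounting, and the function name is made up for illustration.

```python
# Toy model of the free-tier quota refill described above:
# after exhausting the quota, 1 GB becomes available every 6 hours,
# up to a cap of 5 GB. Purely illustrative; MEGA's real accounting
# is opaque and may differ.

REFILL_GB_PER_STEP = 1
STEP_HOURS = 6
CAP_GB = 5

def available_quota_gb(hours_since_exhausted: float) -> int:
    """Return how many whole GB should be downloadable again."""
    refilled = int(hours_since_exhausted // STEP_HOURS) * REFILL_GB_PER_STEP
    return min(refilled, CAP_GB)

print(available_quota_gb(30))  # the full 5 GB back after 30h, as noted above
```

A downloader could persist the "exhausted at" timestamp and call something like this before deciding whether to resume or keep waiting.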

ZenythFactor commented 2 years ago

MEGA will ban you after you've downloaded a few gigs

5GB yes, and after you use that, you can get 1GB every 6h => another 5GB after 30h (1d 6h). It would be possible to save the timestamp to calculate how much one can download depending on their subscription, but I can already tell that'll take some thinking.

Maybe put each MEGA download on queue and halt it when it's past that limit to download to avoid bans or so. What's the source on the banning anyways? I doubt it matters when you're on a certain subscription or so.

At least if you can believe https://www.cloudstorageoptions.com/mega/ (they use Anti-Adblock, though), you got as much bandwidth to use as you got storage space, except for the 5GB for the free tier's 50. gdl already does the waiting game with, for ex, DeviantArt, when the API returns a certain complaint. Leaving MEGA on the side while waiting for the timeout to pass should be possible.

I have the Pro I subscription, so this isn't a hassle for me. It's not like I'm downloading several gbs of content but a bunch, and a bunch, of links to grab and immediately download once it's sniffed out.

mikf commented 2 years ago

There is no completely automated way of doing this, but you could do something similar to https://github.com/mikf/gallery-dl/issues/2246 and only write content that contains mega.co.nz links. Combine this with --no-download --no-skip and get a bunch of files with mega links.
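One possible shape of that setup, as a config sketch: this assumes the metadata post processor's "custom" mode with a `content-format`, plus the per-postprocessor `filter` option, as described in gallery-dl's configuration docs. The exact field names, and whether the extractor actually exposes a `content` field, should be verified against the docs before relying on it.

```json
{
    "extractor": {
        "patreon": {
            "postprocessors": [{
                "name": "metadata",
                "mode": "custom",
                "extension": "txt",
                "content-format": "{content}\n",
                "filter": "'mega.nz' in content or 'mega.co.nz' in content"
            }]
        }
    }
}
```

Run with `--no-download --no-skip` and only posts whose description mentions a MEGA link should leave a text file behind.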

AlttiRi commented 2 years ago

> 1GB every 6h

Do you not use an account with the Mega Desktop App? Upd: I see, it's text from the article.


Also, it would be difficult to make the automatic parsing reliable. Some URLs can come with a separated decryption key.

If you're going to collect them manually, it's better to concatenate all descriptions into one text/HTML file, then parse the links from the file, then put the links into the Desktop App ("Open links" menu item).
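A minimal sketch of that parse step, assuming the descriptions have already been concatenated into one string. The regex is an approximation covering both the old (`mega.co.nz/#!id!key`) and new (`mega.nz/file/id#key`) link styles; as noted above, links whose decryption key sits on a separate line will still need manual pairing.

```python
import re

# Rough pattern for MEGA links; approximate on purpose. Links with a
# separated decryption key cannot be recovered by a regex alone.
MEGA_LINK = re.compile(r"https?://mega(?:\.co)?\.nz/\S+")

def extract_mega_links(text: str) -> list[str]:
    """Return unique MEGA links in order of first appearance."""
    seen: list[str] = []
    for match in MEGA_LINK.findall(text):
        if match not in seen:
            seen.append(match)
    return seen

descriptions = "\n".join([
    "check the HQ files here: https://mega.nz/file/AAAA#BBBB thanks!",
    "old link: https://mega.co.nz/#!CCCC!DDDD",
])
print(extract_mega_links(descriptions))
```

The resulting list can then be pasted into the Desktop App's "Open links" dialog in one go.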


BTW, when you open a video on the site version, you spend your download quota (it's used to load the video for watching).

ZenythFactor commented 2 years ago

> There is no completely automated way of doing this, but you could do something similar to #2246 and only write content that contains mega.co.nz links. Combine this with --no-download --no-skip and get a bunch of files with mega links.

Does this involve actually "downloading" from the links? That's what I need. GDL can already grab MEGA links via metadata, and a similar approach could do the same. If this was actually answered, sorry if I misread/misinterpreted it.

rautamiekka commented 2 years ago

> > 1GB every 6h
>
> Do you not use an account with the Mega Desktop App? Upd: I see, it's text from the article.

I do. The article is about right, too, because when I downloaded way more than 5GB through the desktop app, it took 6h before I could download again, so I had to leave it until the next day to get another sizable amount.

mikf commented 2 years ago

> Does this involve actually "downloading" from the links? That's what I need. GDL can already grab MEGA links via metadata, and a similar approach could do the same. If this was actually answered, sorry if I misread/misinterpreted it.

MEGA is not yet supported by gallery-dl (#1624), and the metadata post processor cannot add additional download URLs to the queue.

You might want to use your own script via exec that feeds MEGA links into megatools or something.
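That glue script could look roughly like this, as a sketch under two assumptions worth checking locally: that metadata was written as JSON files whose post body ends up in a `content` field (true for some extractors, not all), and that a classic megatools `megadl` binary is on PATH. Both the directory name and the field name are illustrative.

```python
import json
import pathlib
import re
import subprocess

MEGA_LINK = re.compile(r"https?://mega(?:\.co)?\.nz/\S+")

def links_from_metadata(root: str) -> set[str]:
    """Collect MEGA links from gallery-dl's *.json metadata dumps."""
    links: set[str] = set()
    for path in pathlib.Path(root).rglob("*.json"):
        data = json.loads(path.read_text(encoding="utf-8"))
        # "content" is where post bodies often end up; adjust the
        # field name for the extractor you actually use.
        links.update(MEGA_LINK.findall(str(data.get("content", ""))))
    return links

if __name__ == "__main__":
    for url in sorted(links_from_metadata("./gallery-dl")):
        # assumes megatools' "megadl" binary is installed and on PATH
        subprocess.run(["megadl", url], check=False)
```

Pointing a run of this at the metadata directory after a `--no-download` pass would hand every discovered link to megatools one by one.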

afterdelight commented 1 year ago

Does megatools support a file archive?

kattjevfel commented 1 year ago

no

afterdelight commented 1 year ago

What MEGA scripts support a file archive? Are there any? I want to move downloaded files to another folder and skip them on the next run.

kattjevfel commented 1 year ago

You'll have to write your own script, please do share it when/if you write it.
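The archive part of such a script is small. Here is a sketch of the idea behind gallery-dl's own `--download-archive`: remember an ID per finished download in a text file and consult it on later runs. The class name and the one-ID-per-line file format are illustrative, not any tool's real format.

```python
import pathlib

class DownloadArchive:
    """Tiny text-file archive: remember finished downloads so a later
    run can skip them, even after the files themselves were moved away.
    """

    def __init__(self, path: str) -> None:
        self._path = pathlib.Path(path)
        # one ID per line; an absent file means an empty archive
        self._seen = (
            set(self._path.read_text().split()) if self._path.exists() else set()
        )

    def contains(self, entry_id: str) -> bool:
        return entry_id in self._seen

    def add(self, entry_id: str) -> None:
        if entry_id not in self._seen:
            self._seen.add(entry_id)
            with self._path.open("a") as fh:
                fh.write(entry_id + "\n")
```

A downloader would check `contains()` before fetching, call `add()` after a successful download, and is then free to move the finished files anywhere.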

ZenythFactor commented 1 year ago

Hopefully a solution comes soon

afterdelight commented 1 year ago

Here is a candidate to fork from: https://github.com/pgp/mega.py