Closed afterdelight closed 4 years ago
I don't plan on adding any more features to this script; it is solely for downloading stories. Use the code in this gist to download posts. https://gist.github.com/notcammy/bb440f7ab615017f62f3f0c87eb7cc2f
@dvingerh Hi, can you share the gist again? I lost the code. Thanks
The gist still exists, only my username has changed 👍 https://gist.github.com/dvingerh/bb440f7ab615017f62f3f0c87eb7cc2f
Oh ok, thanks !
@dvingerh
Hi, I got an error from pyigposts.py while downloading. This problem doesn't occur with pyigdumper.py, probably because of a different module being used.
An error occurred: HTTPSConnectionPool(host='scontent-sin6-1.cdninstagram.com', port=443): Max retries exceeded with url: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
(Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x00000XXXXXXXXXXX>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond'))
Do you know what's causing this?
The script is running fine here, so I can't tell you the exact cause. It looks like Instagram's servers are not responding to the download request (as the error suggests). I suggest trying again at a later time.
It happens inconsistently and randomly on some posts :S
When it happened with igpystories.py, it resumed the download, ignored the error, and skipped to the next story, so the download proceeded. But with igpyposts.py the download gets stuck and the program terminates when it encounters an error. What code should I add/change in igpyposts.py so it can ignore errors and resume downloading like igpystories.py does?
Try this one, it has retry/skip logic like the script in this repo does:
https://gist.github.com/dvingerh/2745b62f46cf2f20852a4393c8c72f8c
I commented out story downloading on L446, but if you want it to do both stories and posts, uncomment it. Untested, but it should do the trick.
As for the failing downloads, you could just have a flaky internet connection.
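For reference, the retry/skip behavior described above usually boils down to wrapping each download in a retry loop that gives up after a few attempts instead of crashing the whole run. This is only an illustrative sketch; the function and parameter names here are assumptions, not code from the gist:

```python
import time

def download_with_retry(download_func, url, retries=3, delay=2):
    """Try a download up to `retries` times; return None to skip it on failure."""
    for attempt in range(1, retries + 1):
        try:
            return download_func(url)
        except Exception as e:
            print("Attempt {} failed for {}: {}".format(attempt, url, e))
            time.sleep(delay)
    print("Skipping {} after {} failed attempts.".format(url, retries))
    return None  # caller skips this post and moves on to the next one
```

In the main loop, a `None` result is simply skipped, so one unreachable CDN URL no longer terminates the program.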
With the new code it stops at the first user and does not continue to the next users.
Oh yeah, I didn't know it had to be able to iterate through users. Added multiple user and batch file support:
https://gist.github.com/dvingerh/2745b62f46cf2f20852a4393c8c72f8c (redownload the script)
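Batch file support of this kind typically means reading one username per line and processing each in turn, catching errors per user so one failure doesn't stop the rest. A minimal sketch, with illustrative names that are not taken from the gist:

```python
def load_usernames(path):
    """Read one username per line from a batch file, skipping blank lines."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def process_all(usernames, download_user):
    """Download each user's content in turn; a failure for one user is logged
    and the loop continues with the next user instead of stopping."""
    for username in usernames:
        try:
            download_user(username)
        except Exception as e:
            print("Failed for {}: {}".format(username, e))
            continue  # move on to the next user
```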
Okay, now it works as intended, thank you again.
There is a bug in the new pyigdumper.py: posts get mixed up randomly between username folders. Some posts from user A and user B land in user C's folder, and some posts from user D and user E land in user F's folder. The folders become a mess, and it's confusing which photos belong to whom.
Fuck, I forgot to add the proper line to avoid that. This should be working now:
https://gist.github.com/dvingerh/2745b62f46cf2f20852a4393c8c72f8c
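A bug like the one above usually comes from building the save path from a stale username variable left over from a previous loop iteration. The fix is to derive the destination folder from the username of the post currently being processed. A hypothetical sketch (the function name is illustrative, not the gist's code):

```python
import os

def save_path_for(base_dir, username, filename):
    """Build the save path from the current post's username so each user's
    downloads land in their own folder, creating the folder if needed."""
    user_dir = os.path.join(base_dir, username)
    os.makedirs(user_dir, exist_ok=True)  # exist_ok requires Python 3.2+
    return os.path.join(user_dir, filename)
```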
Well, hopefully it's working fine now. Thanks again :)
@dvingerh Hey, I tried using both pyigposts.py & pyigdumper.py to download justinbieber's pictures, but the download stops at 21 posts when in fact he has over 5K posts.
Is there something wrong with it, or is it meant to be that way?
Thank you.
@sherl0ck17 This is intended, if you need a scraper that's actively maintained and can scrape entire profiles I recommend looking into other solutions such as https://github.com/arc298/instagram-scraper
How & why does it stop at the 21st photo? What is the reason behind coding it this way?
Scraper meaning it helps to download all photos from an account?
Is there a way for you to code it to bypass a private account and download both the stories & posts of that particular user?
How & why does it stop at the 21st photo? What is the reason behind coding it this way?
Because it only retrieves the first "page" of the user feed, and I haven't implemented pagination (going through the entire user feed) since I didn't need that functionality.
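The pagination mentioned above generally works by following a cursor returned with each page of feed items until the API reports there is nothing more. A hedged sketch assuming a paged response with `items`, `more_available`, and `next_max_id` keys (which Instagram's feed endpoints are known to use); `fetch_page` is a stand-in for the actual request code:

```python
def fetch_all_posts(fetch_page):
    """Collect feed items across pages until the API reports no more.

    `fetch_page(max_id)` is assumed to return a dict shaped like one page of
    an Instagram feed response: {"items": [...], "more_available": bool,
    "next_max_id": "..."}.
    """
    items = []
    max_id = None
    while True:
        page = fetch_page(max_id)           # e.g. one HTTP request per page
        items.extend(page.get("items", []))
        if not page.get("more_available"):
            break                           # last page reached
        max_id = page.get("next_max_id")    # cursor for the next request
    return items
```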
Scraper meaning it helps to download all photos from an account?
Yes
Is there a way for you to code it to bypass a private account and download both the stories & posts of that particular user?
No, there is no way to bypass the privacy of private accounts and view their contents at this time.
Hey man, thanks for the prompt reply and you replying to all my enquiries!
Do update me if ever there's a way to bypass private accounts! You've got my email.
Much appreciated, man! 💯👍
Please add post download like in your personal branch. Thank you!