Open · germyparker opened this issue 1 year ago
First of all, I'm not sure how to file this: I don't think it's a bug, and "[SITE]" seemed like the closest template for a question...?
This might be a weird question, but here goes:
I'm trying to download an entire subreddit that consists only of link posts pointing to comments in other subreddits. I'm hoping to get the single comment each link goes to (ideally the entire thread that follows it, but beggars can't be choosers).
However, instead of getting an .md file containing the comment, I'm getting the content of the linked thread's OP. Does that make sense?
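To illustrate what I'm hoping to end up with, here is a rough PRAW sketch (not bdfr itself; the credentials are placeholders, and I'm assuming the link post's URL is a plain comment permalink) that pulls the single linked comment plus the replies under it:

```python
# Rough PRAW sketch: given the comment permalink a link post points at,
# fetch that one comment and the thread of replies below it.
# Credentials are placeholders; np.reddit.com hosts are rewritten to
# www.reddit.com before parsing.
import praw

reddit = praw.Reddit(
    client_id="CLIENT_ID",          # placeholder
    client_secret="CLIENT_SECRET",  # placeholder
    user_agent="linked-comment-fetch-sketch",
)

def fetch_comment_thread(permalink: str):
    url = permalink.replace("np.reddit.com", "www.reddit.com")
    comment = reddit.comment(url=url)
    comment.refresh()                         # load the comment and its replies
    comment.replies.replace_more(limit=None)  # expand "load more comments" stubs
    return [(c.author, c.body) for c in [comment] + comment.replies.list()]

for author, body in fetch_comment_thread(
    "https://old.reddit.com/r/reddevils/comments/146eg1s/"
    "brandon_williams_rant_roudup/jnqnprn/"
):
    print(f"{author}: {body}\n")
```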
Alternatively: the subreddit reposts comments by one specific user, so a fallback is to just download everything that user has ever said. This is sub-optimal for two reasons: first, not every comment is useful or interesting (the subreddit collects only the good ones), and second, after about 30 posts the run dies with an error.
Here's the command I'm using:
and the full error:
Finally, I think this is the post it's failing on:
https://old.reddit.com/r/reddevils/comments/146eg1s/brandon_williams_rant_roudup/jnqnprn/
I'm using the latest version, installed via pip and updated last week.
To reiterate: I would much prefer a solution to the original problem, if one exists, i.e. how to download posts that are links to comments.
Reply:

What is the subreddit you originally tried to download, and what command did you use? I'd like to try this myself. If you are talking about r/ShitPoppinKreamSays, it fails to download anything because the links there are np.reddit.com links, and bdfr has no proper download module for those. You could, however, try scraping the log bdfr generates for these links, collecting them in a file, and downloading that file with `bdfr archive --include-id-file comments.txt --comment-context`. See #835 for some inspiration on how I tried that method; some kind of hacking is probably required. Also note that #851 could cause some issues here. I check GitHub very infrequently, so a reply might take some time, but I hope my tips help somewhat!
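To make the log-scraping step concrete, here is a rough Python sketch. Assumptions to flag: the np.reddit.com permalinks appear verbatim in the log, the log path `log_output.txt` is a placeholder, and `--include-id-file` is happy with one link per line, as suggested above.

```python
# Rough sketch: collect every np.reddit.com comment permalink that bdfr
# logged into comments.txt, one per line, for a later --include-id-file run.
# The log path is a placeholder; adjust it to wherever bdfr writes its log.
import re
from pathlib import Path

LOG_FILE = Path("log_output.txt")  # placeholder log location
OUT_FILE = Path("comments.txt")

# Comment permalinks look like .../comments/<post_id>/<slug>/<comment_id>/
PERMALINK = re.compile(r"https?://np\.reddit\.com/r/\w+/comments/\w+/[^/\s]+/\w+/?")

links = sorted(set(PERMALINK.findall(LOG_FILE.read_text(encoding="utf-8"))))
OUT_FILE.write_text("\n".join(links) + "\n", encoding="utf-8")
print(f"Wrote {len(links)} comment links to {OUT_FILE}")
```

The download step would then be something along the lines of `bdfr archive ./archive --include-id-file comments.txt --comment-context` (the directory is a placeholder), keeping the #851 caveat above in mind.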