mgsk closed this issue 11 years ago.
:+1: I can take a look at this unless @joshua-hull is interested
aggrolite, I made a small change but haven't forked the repo:
print 'Extracted ' . @links . " images for download from r/$sub\n";

foreach my $img (@links) {
    # Skip the download if the file is already on disk.
    if (-e "$sub/" . $img->{file_name}) {
        print 'File already exists... ' . $img->{file_name} . "\n";
    }
    else {
        print 'Downloading ' . $img->{url} . "\n";
        $mech->get($img->{url}, ':content_file' => "$sub/" . $img->{file_name});
        sleep INTERVAL;
    }
}
Sorry, didn't mean to close the issue.
Added @mgsk's code and pushed. Thanks for the suggestion. I bet imgur will appreciate us not chewing through unnecessary bandwidth.
:+1:
It would be useful if this script could download just those pictures that haven't been downloaded before.
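One way to go beyond the -e check above, which only skips files still present on disk, would be to keep a persistent log of URLs that have already been fetched, so images aren't re-downloaded even after the local copies are deleted or renamed. A minimal sketch, assuming the same $mech, @links, $sub, and INTERVAL as in the snippet above; the downloaded.log filename is only an illustration:

# Sketch: remember every URL fetched for this subreddit in $sub/downloaded.log,
# so images downloaded in earlier runs are skipped even if the files are gone.
my $log_file = "$sub/downloaded.log";   # hypothetical log location
my %seen;

# Load URLs recorded by previous runs, if the log exists.
if (open my $log, '<', $log_file) {
    chomp(my @urls = <$log>);
    @seen{@urls} = ();
    close $log;
}

foreach my $img (@links) {
    if (exists $seen{ $img->{url} }) {
        print 'Already downloaded ' . $img->{url} . "\n";
        next;
    }

    print 'Downloading ' . $img->{url} . "\n";
    $mech->get($img->{url}, ':content_file' => "$sub/" . $img->{file_name});

    # Record the URL only if the fetch succeeded, so failures can be retried.
    if ($mech->success) {
        open my $log, '>>', $log_file or die "Can't append to $log_file: $!";
        print {$log} $img->{url} . "\n";
        close $log;
    }

    sleep INTERVAL;
}

Appending to the log right after each successful get() keeps the bookkeeping simple; combining it with the existing -e check would cover both cases (file still on disk, or previously fetched and since removed).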