Closed ytoaa closed 1 year ago
Hi, if the server changes the filenames, there is no way for the script to be aware of that. So I would recommend deleting those duplicated files with a separate script, since hashing every file inside the directories would slow the script down dramatically, even when downloading a single file.
Another way to mitigate this is to use the --archive
directive; once enabled, the script won't re-download posts that have already been downloaded.
I may implement a "duplicate file finder" or similar feature in the future, either as a separate directive or as an integrated script, but it is still highly recommended to handle this with other dedicated tools.
Oh, okay. I'll remove the duplicates with a separate script.
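A minimal sketch of what such a separate deduplication script could look like (hypothetical, not part of this tool): it groups files by the SHA-256 hash of their content, so duplicates are detected even when the server has assigned them different names:

```python
import hashlib
import os
import sys
from collections import defaultdict

def file_sha256(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Group all files under `root` by content hash; keep only groups with more than one file."""
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            by_hash[file_sha256(path)].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    for digest, paths in sorted(find_duplicates(root).items()):
        print(digest)
        # Keep the first copy; the rest are candidates for deletion.
        for extra in paths[1:]:
            print("  duplicate:", extra)
```

This only prints the duplicates rather than deleting them, so you can review the groups before removing anything; hashing every file is exactly the slow full scan mentioned above, which is why it belongs in a one-off cleanup script rather than in the download loop.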
Thankfully, I'm using the tool without problems otherwise. However, if the file name changes when a file is downloaded from the server, the hash check is meaningless. Of course, this could be worked around by removing the file name portion from the file pattern, but then it would no longer be clear which file is which. Is there any way around this? For now, we are double-checking with a program called "AllDup".