jonelo / jacksum

A hash utility, est. 2002, FLOSS. 489 hash functions, HMAC support, cross-platform, feature-rich, multi-threaded. CLI and API. Recursive hashing, predefined and customizable formats, verify data integrity and find ok/failed/missing/new files, find files by their hashes, find the hash function that produced a given hash. GUI provided by HashGarten.
https://jacksum.net
GNU General Public License v3.0

Feature Request: Option to include filename in output (-o filename.txt) #3

Closed: lexterror closed this issue 3 years ago

lexterror commented 3 years ago

It would be great if it could print the output for each individual file in a separate .txt file containing the hash!

Thank you!

jonelo commented 3 years ago

Interesting feature request, but what's wrong with one check file containing all the information needed for a later integrity verification? Actually, I am wondering what the benefit is of having the file system cluttered with hundreds or even thousands of small hash files. It could also become difficult to get rid of those hash files again when you no longer need them. And if you hash a folder where you only have read permission, where would those hash files be stored?

Please don't get me wrong, I just need to understand the actual use case. I believe this needs a bit of discussion or enlightenment so that I can see the benefit as well ;-)

jonelo commented 3 years ago

Or maybe I completely misunderstood your feature request (which is very likely)?

lexterror commented 3 years ago

Hi,

Your program is very powerful! I actually use quite often:

https://www.binaryfortress.com/HashTools/

It has this feature, but it only supports a limited number of hash types, which are somewhat outdated. It's also able to recursively compare each individual file against the hash stored for that file, making it really easy to test file integrity! You could try it out and see if you like it! I personally love it!

lexterror commented 3 years ago

Example:

-o *.sha3-512

This would recursively create a *.sha3-512 file for every file!

myfile1.doc would have a myfile1.doc.sha3-512 file with the hash!

And then you could run something like:

-check *

And it would then compare (recursively) every file with the *.sha3-512 file of the same name!

And hopefully give an "ok" after each comparison!

To me this seems logical, and of course a very fast way to test file integrity! I don't see a point in visually comparing a hash value unless you are double-checking! It's practical and saves time!

jonelo commented 3 years ago

Thanks for elaborating on this issue.

Well, Jacksum already supports file integrity checking, and that is a very easy process actually.

  1. Calculate the SHA3-512 message digest of all files in the current working directory (.) and its subfolders, and store the output in all-my-hashes.txt:
jacksum -a sha3-512 -E hex -o all-my-hashes.txt .
  2. Verify all hashes stored in the all-my-hashes.txt file, and find all new files stored anywhere in the directory tree below the current working directory:
jacksum -a sha3-512 -E hex -c all-my-hashes.txt .

Is it still on your wishlist to have a separate hash file for each hashed file?

lexterror commented 3 years ago

It's good enough for me! Thank you for your time! It's really a great program! I'm always interested in new things! Thank you!

Alepod commented 3 years ago

@lexterror You could also use CMD's for command to achieve the same effect.

lexterror commented 3 years ago

Great! Thank you Alepod!

jonelo commented 3 years ago

Thanks folks. For just a few files, the scripting idea would be an acceptable choice to cover that corner case.

Please let me explain why I don't want to code that use case into Jacksum, because there are good reasons not to. The advantages of having exactly one check file (rather than a hash file for each file) are the ones I mentioned above: the file system isn't cluttered with thousands of small files, the hashes are easy to remove again when you no longer need them, and it works even for folders where you only have read permission.

IMHO, having a separate hash file for each file is a bad design decision for a data integrity tool. And just because there is one Windows tool on the web that does it that way does not necessarily mean that it is the right thing to do. Just my two cents.

If there are any good reasons or use cases that I have overlooked, please don't hesitate to fire away, and I'll be happy to think about it twice or even a third time. For now, I just don't see a good reason to have a separate hash file for each hashed file.

Thanks for reading, Johann

lexterror commented 3 years ago

Hi Jonelo!

I really don't mind using one file for storing hashes. I did notice that if the folder structure isn't the same later, the compare command doesn't find certain files! It could be that I'm missing some options? It would be nice to be able to just check recursively even if the folder structure isn't the same! But there's no rush; it's really fantastic that there are so many possibilities! Thank you!

At the moment I have a comic book collection on my PC that uses a program called "Neeview". On Android I have the same collection, but I'm using a program called "Tachiyomi", which requires a different folder structure. I also read a description online a few years ago saying that it was a 'standard' to have '*.sha512' files, for example... It could be false advertising!

lexterror commented 3 years ago

Using the one-file method, I think this would work for a new verify command:

[1] Recursively load into RAM the names of all the files (including files in subfolders).
[2] Load into RAM all the file names listed in "all-my-hashes.txt".
[3] Filter out the non-matching file names.
[4] Compare the hashes for the file names that are left.

*This could be a new verify command for non-conforming folder structures.