Closed: Sozialarchiv closed this issue 1 year ago
Not really, but it is a trivial thing. If the file is small, you can read it all at once and hash it with the one-shot function.
If the file can be big, you should use the streaming algorithm, feeding it a chunk of data after every read, like the following:
let mut hasher = xxhash_rust::xxh3::Xxh3::new();
let mut file = std::fs::File::open("Cargo.toml").expect("open file for chunk reading");
let mut buf = [0u8; 10]; // use a bigger buffer in the real world (e.g. 4096 bytes)
loop {
    match std::io::Read::read(&mut file, &mut buf) {
        Ok(0) => {
            println!("end of file reached");
            break;
        }
        Ok(len) => {
            println!("file chunk: {len} bytes");
            hasher.update(&buf[..len]);
        }
        // Reads interrupted by a signal should simply be retried;
        // any other error is fatal here.
        Err(error) if error.kind() == std::io::ErrorKind::Interrupted => continue,
        Err(error) => panic!("error reading chunk: {}", error),
    }
}
let hash = hasher.digest128();
println!("file hash: {}", hash);

// Sanity check: the streamed hash must equal the one-shot hash of the whole file.
let data = std::fs::read("Cargo.toml").expect("read full data");
let expect_hash = xxhash_rust::xxh3::xxh3_128(&data);
assert_eq!(expect_hash, hash);
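The chunked-update pattern above can also be sketched with the standard library alone, using DefaultHasher as a hypothetical stand-in for Xxh3 (it is not xxHash, but its Hasher::write plays the same role as update): feeding the bytes in pieces yields the same digest as feeding them all at once.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::Hasher;
use std::io::Read;

fn main() {
    // Stand-in input; a &[u8] implements Read, so no file is needed for the sketch.
    let data = b"some bytes to hash in chunks";

    // Streaming: feed the data in small chunks, exactly like the file loop above.
    let mut streamed = DefaultHasher::new();
    let mut reader: &[u8] = data;
    let mut buf = [0u8; 4]; // deliberately tiny to force multiple reads
    loop {
        match reader.read(&mut buf) {
            Ok(0) => break, // end of input
            Ok(len) => streamed.write(&buf[..len]),
            Err(error) => panic!("error reading chunk: {}", error),
        }
    }

    // One-shot: hash everything in a single call.
    let mut oneshot = DefaultHasher::new();
    oneshot.write(data);

    // Feeding the same bytes in chunks yields the same hash.
    assert_eq!(streamed.finish(), oneshot.finish());
    println!("hashes match: {}", streamed.finish());
}
```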
Thank you very much for this example and the fast answer.
Thanks a lot for this great library.
Is there an example available that reads a file through a buffer (streaming)?