When updating a key, if the CRC32 and the length of the new payload match the existing entry's CRC32 and length, the update is skipped. This was in the original specification to avoid unneeded writes. But after investigation, relying on CRC32 and length is not enough: getting the same CRC32 with the same length but a different payload seems quite easy and probable, even on a small dataset.
Since sha1 is already implemented for the secure challenge, it's possible to reuse the hashing function. When CRC32 and length match (which should not happen often, depending on the use case), it would be safer to fetch the old data, compute the hash of both the old and the new payloads, and check whether they are the same.
There is no need for sha1: a simple byte compare of both payloads is faster, since whatever hash algorithm we use, we have to read the content at least once anyway... let's just do a memory compare then.
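A minimal sketch of the resulting check, in C. The names (`crc32_compute`, `update_skippable`) are illustrative, not the actual codebase's API, and the bitwise CRC32 here just stands in for whatever CRC implementation the database already uses:

```c
#include <stdint.h>
#include <string.h>

/* Minimal bitwise CRC32 (IEEE polynomial), a stand-in for the
 * database's own CRC implementation. */
static uint32_t crc32_compute(const void *data, size_t len) {
    const uint8_t *p = data;
    uint32_t crc = 0xffffffff;
    for (size_t i = 0; i < len; i++) {
        crc ^= p[i];
        for (int b = 0; b < 8; b++)
            crc = (crc >> 1) ^ (0xedb88320u & (uint32_t)-(int32_t)(crc & 1));
    }
    return ~crc;
}

/* Returns 1 when the update can be skipped: same length, same CRC32,
 * and, to rule out CRC collisions, identical bytes. The memcmp only
 * runs on the rare CRC32+length match, and costs one read of the old
 * payload, which any hash-based check would need anyway. */
static int update_skippable(const void *oldp, size_t oldlen,
                            const void *newp, size_t newlen) {
    if (oldlen != newlen)
        return 0;
    if (crc32_compute(oldp, oldlen) != crc32_compute(newp, newlen))
        return 0;
    return memcmp(oldp, newp, oldlen) == 0;
}
```

The fast path (length then CRC32) rejects almost all genuine changes without touching the old payload; the byte compare is only the final confirmation.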