trishankkarthik closed this issue 2 years ago
It seems that cryptography
will support Keccak/SHA-3 once OpenSSL does.
What's our story for Keccak (aka SHA-3)?
Is there documentation that we can look at for how to use simplesha3? There isn't much in their README.
Glancing at the .c file, I tried the following:
```
$ pip install simplesha3
$ python
>>> import simplesha3
>>> simplesha3.sha3256('hash me')
'\x8dGH\x17\xce&\x1f\x93\xfc\xb4\xee5\xf2\xd9\x03\xe6u\x1b\xbe\x17\x9e\x9d|wB<v\xac\xd0\xdb\xa9\xe5'
>>> simplesha3.keccakc512('hash me')
'\xf4_\xa8O\x8b\xfd\xfbXE\x8d\xf9\x99\xc0\x8fX\xb7=6\xf94\x8c\xd7{\xc7\x86\xdek\x95%\x9dg\xe5'
```
It would be nice if there were an update() function, a way to create a hash object, etc.
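For what it's worth, recent Python standard libraries (3.6+) expose SHA-3 in hashlib with exactly this incremental interface, so no third-party library is strictly needed. A minimal sketch:

```python
import hashlib

# hashlib (Python 3.6+) exposes SHA-3 with the standard incremental API:
# create a hash object, feed it chunks via update(), then read the digest.
h = hashlib.sha3_256()
h.update(b"hash ")
h.update(b"me")
incremental = h.hexdigest()

# One-shot hashing of the same bytes produces the same digest.
assert incremental == hashlib.sha3_256(b"hash me").hexdigest()
print(incremental)
```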
You might be interested in trying out bpython
as your interpreter. It makes it super simple to poke around in new libs when the docs are lacking (or wrong/out of date/whatever).
+1 --- love bpython
One thing to keep in mind is that "SHA-3" is explicitly designed to be a drop-in replacement for SHA-2. Since you don't actually need that here, there are better alternatives.
This email from the Keccak Team covers several of them:
https://public-inbox.org/git/91a34c5b-7844-3db2-cf29-411df5bcf886@noekeon.org/
SHAKE128 in particular is interesting because it's part of the SHA-3 family (and should hopefully be available anywhere SHA-3 is), but is faster as they optimized for performance instead of NIST's requirements. It's technically an Extensible Output Function (XOF) but could be used as a replacement for a hash function in something like TUF.
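To make the XOF point concrete, here is a minimal sketch using Python's hashlib (3.6+), where SHAKE128 takes the output length as an argument at digest time; a 32-byte output can stand in for a 256-bit hash:

```python
import hashlib

# SHAKE128 is an Extensible Output Function: the caller picks the
# output length when reading the digest, not when constructing it.
xof = hashlib.shake_128(b"hash me")

d32 = xof.digest(32)  # 32-byte output, a drop-in for a 256-bit hash
d16 = xof.digest(16)  # shorter outputs are prefixes of longer ones

assert len(d32) == 32
assert d32[:16] == d16
```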
In order to increase the diversity of cryptographic hash algorithms used in TUF…
As a counterpoint, some advice from Adam Langley regarding SHA-3: maybe you should skip it:
https://www.imperialviolet.org/2017/05/31/skipsha3.html
...diversity of cryptographic primitives is expensive. It contributes to the exponential number of combinations that need to be tested and hardened; it draws on limited developer resources as multiple platforms typically need separate, optimised code; and it contributes to code-size, which is a worry again in the mobile age. SHA-3 is also slow, and is even slower than SHA-2 which is already a comparative laggard amongst crypto primitives.
I concur with @tarcieri in principle. However, SHA-3 will probably be necessary in the long term regardless of its desirability, given that it is the NIST-approved next-generation hash and TUF has highly regulated users through projects like Uptane. Going forward, though, would it make sense to restrict canonical TUF implementations to a very limited set of hashing (and maybe signing) algorithms? This could be part of the verification suite.
@endophage you seem to be falling into the exact trap Adam Langley was describing:
Yet there is a natural tendency to assume that SHA-3 must be better than SHA-2 because the number is bigger.
There is a simple solution to "given it's the NIST approved next gen hash and TUF has highly regulated users", and that is:
SHA-2 is also a NIST-approved hash function, and there is no reason to suspect at this time that anything is going to change about that for the foreseeable future.
Regarding "projects like Uptane", there is definitely a reason to prefer SHA-2 for these sorts of projects: SHA-2 has ubiquitous hardware implementations, and SHA-3 does not.
Not at all. I'm making no claim that SHA-3 is "better", only that TUF has a meaningful number of users through projects like Uptane in an industry that will likely require compliance with NIST standards, and therefore, as I said, it will be necessary.
They can be compliant with NIST standards by using SHA-2 (and I say this as someone whose day-to-day is working in a strict FIPS 140-2 level 3 environment)
For some period of time, yes. The NSA, though, released new guidelines last year (maybe even 2015, I can't remember) telling people to upgrade to SHA-512 or better, so I'm not convinced we'd be doing harm by future-proofing. This issue, unless mis-worded, is about supporting SHA-3, not requiring its usage.
As far as the spec, it wouldn't have to only be SHA-2 and SHA-3. It could be some minimal set that includes those and a carefully hand-picked additional one or two hashing algorithms.
Is it necessary for the TUF specification to require certain hashing and signing algorithms over others? Can't this be left to the implementation? The spec can certainly provide recommendations.
Although if an implementation is free to choose the hashing and signing algorithms, conformance testing might not be as straightforward.
It depends on what protections would be lost if an implementer chose a bad hashing algorithm. For the purposes of security and verification, it might be beneficial for the spec to include very specific requirements on hashing algorithms to strengthen the guarantees provided by the verifier.
So, as we've worked more and more with people employing TUF on weak, embedded devices, I have become more convinced that we should not specify the exact hashing algorithms used. We certainly don't want TUF to be prohibitive for implementers just because of that requirement, if they are making a pragmatic and calculated decision.
However, I think that it would be a good idea to strongly recommend the use of certain algorithms. Of course, we would update this list over time...
The drawback of algorithm agnosticism is the creation of a non-interoperable, fractured ecosystem, and it forces implementations that try to be fully interoperable to turn into a sort of "cipher zoo".
The SHA-2 family is ubiquitous, and should be available in some form on practically any device you can name. Optimized implementations are available for extremely low-powered microcontrollers, for example 8-bit AVR uCs: https://eprint.iacr.org/2012/156.pdf
I would strongly suggest picking a single "recommended" hash function from the SHA-2 family, either SHA-256 or SHA-512. SHA-256 is friendlier to low-power devices, and given the upcoming Intel SHA extensions support SHA-256 and not SHA-512, SHA-256 seems like the clear choice to me.
It doesn't have to be "Mandatory to Implement", but I think recommending SHA-256 makes a lot more sense than trying to remain completely algorithm agnostic.
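The "recommended but not mandatory" idea could look something like the sketch below. The names and policy here are purely illustrative (they are not from the TUF spec or securesystemslib); the point is that a spec can distinguish a recommended set from a broader acceptable set:

```python
# Hypothetical sketch of a "recommended, not mandated" algorithm policy.
# These sets are illustrative only, not from the TUF specification.
RECOMMENDED = {"sha256"}
ACCEPTABLE = {"sha256", "sha512"}

def check_algorithms(used):
    """Reject algorithms outside the acceptable set; return the (allowed
    but non-recommended) algorithms so an implementation can warn."""
    unknown = set(used) - ACCEPTABLE
    if unknown:
        raise ValueError(f"unsupported hash algorithms: {sorted(unknown)}")
    return [a for a in used if a not in RECOMMENDED]

print(check_algorithms(["sha256", "sha512"]))  # → ['sha512']
```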
Sounds good. Any objections to us recommending SHA-256?
NSA has specifically ceased to recommend SHA-256 in the Commercial National Security Algorithm Suite (CNSA): https://www.iad.gov/iad/library/ia-guidance/ia-solutions-for-classified/algorithm-guidance/cnsa-suite-and-quantum-computing-faq.cfm (you'll get a cert error because they use their own CA)
As someone who works in a strict FIPS 140-2 level 3 environment, let me opine:
1) This has nothing to do with strict FIPS 140-2 level 3 compliance. If compliance is your concern, SHA-256 will still be compliant.
2) Why did they do this? It appears to be a conservative hedge against Grover's algorithm reducing SHA-2's preimage resistance below 128 bits. There is no reason to believe this will actually happen in the real world. In fact, a good-faith estimate of what it would take to break an algorithm at SHA-2's security level using Grover's algorithm today is astronomically outside what is practically possible.
3) Presuming there isn't a catastrophic failure of SHA-2 as an algorithm, or major developments in applying Grover's algorithm to hash functions, there is no reason to presume that Grover's algorithm would even apply to SHA-256. I would put this in the same realm as "assuming the sky doesn't turn green" or "assuming the ocean doesn't turn to lava".
All that said, as someone who spends each day working in strict FIPS 140-2 level 3 environments, I still recommend SHA-256, and see no reason why SHA-256 would not be allowed in strict FIPS 140-2 level 3 environments in future decades.
SHA-512/256 (SHA-512 with distinct initial values, truncated to 256 bits) is also a very valid option, but it is less forward-thinking: low-power devices are less likely to support SHA-512, and 32-bit devices cannot compute SHA-512 in software as efficiently as SHA-256.
I consider a recommendation from the NSA to upgrade to be extremely high signal. If you just want to stick with the SHA-2 family, would SHA-512 not be satisfactory? We could happily meet both FIPS compliance and the NSA's advice.
SHA-512 is fine; in fact, SHA-512 (possibly truncated to 256 bits, à la SHA-512/256) is the fastest option in the SHA-2 family when implemented in software on modern x86 processors.
Just keep in mind:
1) The NSA's argument, for the reasons I just explained, is garbage and not in any way normative in terms of FIPS standards.
2) SHA-512 won't be available in hardware on future Intel processors. SHA-256 will.
3) Hardware implementations of SHA-512 are often not available on low-power microcontrollers with crypto accelerators, where SHA-256 will be.
The drawback of algorithm agnosticism is the creation of a non-interoperable, fractured ecosystem
This is going to happen regardless, especially once they drop the requirement for. If one uses JSON, another DER, another protobufs, there is no way they are going to interoperate. There are also a fair number of optional features, not to mention no specification of what types of keys should be used. They just happen to use Ed25519 in the spec and say "RSA could work too."
I don't think there should be a huge fight to make everything cross compatible since this is for updating which means it is already highly application/context specific.
If one uses JSON, another DER, another protobufs, there is no way they are going to interoperate.
There are potential ways to pull that off, but that's a topic for another thread.
There have been a couple interesting blog posts on this topic over the past few days:
Also notable is Tom Ptacek's comment that SHA-512/256 mitigates length extension attacks.
Hi! I'd like to try to work on this issue. It would be my first one. Could you walk me through it so I can help?
@trkohler Find sha256 in the securesystemslib codebase. What we need to do is not just add SHA3-256, but refactor the original sha256 to mean SHA2-256. We can probably safely remove sha512 (by which we mean SHA2-512) after that.
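One way to picture that refactor is a table from unambiguous algorithm names to constructors. The names below are illustrative only; they are not securesystemslib's actual API:

```python
import hashlib

# Illustrative mapping only -- not securesystemslib's actual interface.
# Family-qualified names ("sha2-256" vs "sha3-256") remove the ambiguity
# of a bare "sha256".
HASH_CONSTRUCTORS = {
    "sha2-256": hashlib.sha256,
    "sha2-512": hashlib.sha512,
    "sha3-256": hashlib.sha3_256,
}

def digest(algorithm, data):
    try:
        return HASH_CONSTRUCTORS[algorithm](data).hexdigest()
    except KeyError:
        raise ValueError(f"unsupported algorithm: {algorithm}") from None

assert digest("sha2-256", b"") == hashlib.sha256(b"").hexdigest()
```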
While we are at it, we might want to add support for BLAKE3.
@trkohler Thanks for the interest. You might also want to read the pull request that added [BLAKE2 support](https://github.com/theupdateframework/tuf/pull/993), as it will most likely be similar to this issue.
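Note that BLAKE2 needs no extra dependency in modern Python, since it has been in hashlib since 3.6 (BLAKE3, by contrast, would require a third-party package). A minimal check:

```python
import hashlib

# BLAKE2b with a tunable digest size; 32 bytes matches SHA-256's output.
d = hashlib.blake2b(b"hash me", digest_size=32).hexdigest()
assert len(d) == 64  # 32 bytes -> 64 hex characters
```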
Current state: path_hash_prefixes use SHA-256, per the specification.

I'll close this: issues about changing the path_hash_prefixes algorithm should be filed in the specification repository; issues about new MetaFile/TargetFile hashes should go to securesystemslib.
In order to increase the diversity of cryptographic hash algorithms used in TUF, we should also support SHA-3 besides SHA-2 in our reference implementation. The simplesha3 library is a nice implementation that we could use to do so.
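Supporting both families side by side could be sketched as below, using only hashlib rather than simplesha3; the `verify_hashes` helper and the dict layout are hypothetical illustrations, not the reference implementation's actual code:

```python
import hashlib

# Hypothetical sketch: verify data against a TUF-style "hashes" mapping
# that carries both a SHA-2 and a SHA-3 digest. Names are illustrative.
def verify_hashes(data, expected):
    for name, want in expected.items():
        if hashlib.new(name, data).hexdigest() != want:
            return False
    return True

data = b"hash me"
expected = {
    "sha256": hashlib.sha256(data).hexdigest(),
    "sha3_256": hashlib.sha3_256(data).hexdigest(),
}
assert verify_hashes(data, expected)
```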