Closed brson closed 9 years ago
There's some pure-Rust work on https://github.com/DaGenix/rust-crypto. http://nacl.cr.yp.to/ is probably among the most trustworthy libraries at this point, but hasn't been (to my knowledge) widely deployed.
Oh, and I almost forgot about https://github.com/dnaq/sodiumoxide, which is an initial attempt at binding libsodium.
Is the goal of the crypto library to provide Rust interfaces to a wide set of commonly used algorithms? If so, I don't think NaCl/libsodium are a great fit, since I don't believe they include algorithms such as MD5 or TLS.
For just making things secure, NaCl seems to be the go-to library, but it only supports certain cryptographic primitives, which makes it not at all useful for a library that wants to provide general cryptographic code. It is not designed to be a general-purpose, one-stop-shop cryptographic library. For example, it will be impossible to create a standards-compliant TLS implementation with NaCl, because it does not provide AES-CBC.
Some cryptographic functions that NaCl does not support:
For what it's worth, Go implements most of its cryptographic code in Go (though Adam Langley has admitted that they have not mitigated timing attacks in their crypto/rsa package).
FWIW, I am a contributor to the cryptography library for Python. You could consider taking the approach we did and design a "pluggable" API that binds existing C libraries. For a general-purpose crypto library, NaCl is almost definitely not a good fit, because common algorithms aren't present in it. While there has been much bad press about OpenSSL lately, it is really the most viable library to wrap, since it is available on all common platforms and almost certainly already installed on Linux distros. The current scrutiny on it should help turn up security bugs as well. Our API also allows other C libraries to be used (we bind CommonCrypto on OS X as well, and I have PolarSSL bindings in the works).
If designed well, I can see the potential Rust crypto library depending on C bindings at first but eventually being seamlessly replaced by Rust code without much backwards incompatibility.
I highly recommend you package something like NaCl as a high-level crypto interface - call it "crypto/box" or similar. This is the package you want the vast majority of application developers to use.
Other specific protocols should similarly be packaged in high-level interfaces when it makes sense to provide them, e.g. "crypto/tls".
While a lower-level set of primitives is necessary for protocol implementers, documentation and naming conventions should steer people away from them. Call this package "crypto/low-level" or "crypto/arcane" or something like that.
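A minimal sketch of what such a high-level package's surface might look like. The names (SecretBox, seal, open) are hypothetical, loosely modeled on NaCl's secretbox; the XOR body is a placeholder only so the sketch runs, and a real implementation would use an authenticated cipher such as XSalsa20-Poly1305:

```rust
// Hypothetical "crypto/box"-style API shape. The XOR "cipher" below is a
// placeholder ONLY so the sketch compiles and runs; it is NOT encryption.
pub struct SecretBox {
    key: [u8; 32],
}

impl SecretBox {
    pub fn new(key: [u8; 32]) -> SecretBox {
        SecretBox { key }
    }

    // Placeholder: XOR with the key, repeated. A real seal() would
    // encrypt-and-authenticate under a nonce.
    pub fn seal(&self, plaintext: &[u8]) -> Vec<u8> {
        plaintext
            .iter()
            .enumerate()
            .map(|(i, b)| b ^ self.key[i % 32])
            .collect()
    }

    pub fn open(&self, ciphertext: &[u8]) -> Vec<u8> {
        // XOR is its own inverse, so "decryption" is the same operation here.
        self.seal(ciphertext)
    }
}

fn main() {
    let sbox = SecretBox::new([7u8; 32]);
    let ct = sbox.seal(b"attack at dawn");
    assert_eq!(sbox.open(&ct), b"attack at dawn".to_vec());
    println!("roundtrip ok");
}
```

The point is the shape, not the internals: two calls, no algorithm choices exposed, so application developers can't hold it wrong.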
This Rust crypto library should be mentioned (surprised it hasn't been): ClearCrypt
ClearCrypt is not a general-purpose crypto library but an implementation of an encryption protocol
@cmr: ClearCrypt contains many parts of a general-purpose crypto library, similar to how OpenSSL does (though it's an implementation of the SSL/TLS protocol).
It doesn't now, and since it won't have the mess of supported ciphers etc that TLS does, I doubt it will (and it shouldn't!).
Right, it's a WIP, I just wanted to mention it. I hope I didn't offend you by doing so. ;)
We do also have the https://github.com/sfackler/rust-openssl bindings, maintained by @sfackler and myself.
My current opinion is that we should not distribute any crypto written in Rust, but that distributing bindings to well-regarded crypto is fine.
I would also like to point out the silliness of inventing a safe programming language only to encourage writing bindings back to C. HT to @dchest for phrasing that so well.
The "silliness" comes from new crypto requiring audits by cryptographers. The Rust team doesn't have any dedicated cryptographers at the moment. So it's "safer" to use already-audited code in C.
Uh huh. And which code would that be?
Hopefully LibreSSL in the not so distant future?
LibreSSL is only going to be portable if and when OpenBSD starts receiving funding for it, and there's not a great history of OpenBSD projects receiving funding.
While there has been much bad press about OpenSSL lately, it is really the most viable library to wrap, since it is available on all common platforms and almost certainly already installed on Linux distros.
NSS would fit the cross-platform bill too. It's used in Firefox, after all.
My current opinion is that we should not distribute any crypto written in Rust
I can understand why for some things, but choosing OpenSSL at this point would make most people wonder what the point of Rust is after all. Anyway, I don't see why not to implement things like digests in Rust straight away (they are already in rust-crypto, for that matter; I've been using Sha1, Md5, and Hmac from there). Sure, that would be redundant with whatever library's bindings are distributed, but the same could be said of a lot of things in the standard library that are redundant with the system libc.
Most of the flaws in the most recent OpenSSL security advisory are not related to memory safety.
I can understand why for some things, but choosing OpenSSL at this point would make most people wonder what the point of Rust is after all.
Just to put it on the record, I am one of those people. :+1:
Also, sorry to be off-topic, but @glandium, have you realized that your "randomly generated" GitHub profile picture is flipping all of us off? ;)
SUPERCOP includes many highly optimized implementations of a wide variety of crypto primitives. The trouble is that it is absolutely not designed to make them available to programming languages, only to benchmark them. Using them is hard work! I wrote this tool which builds a shared library out of SUPERCOP primitives - it was not straightforward:
http://hg.opensource.lshift.net/bletchley-primitives/file/tip
In a perfect world, I think Rust would provide an abstract, high-level API for encryption backed by multiple "providers", à la the Java Cryptography Architecture:
http://docs.oracle.com/javase/6/docs/technotes/guides/security/crypto/CryptoSpec.html#Design
This approach would allow Rust to have a cryptography story without having to vendor or otherwise ship actual encryption code as part of the core distribution.
Instead, ship the multi-provider API as part of core Rust, and developers can plug in the provider modules that are appropriate to a particular purpose.
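A minimal sketch of what such a provider registry might look like in Rust, under stated assumptions: the trait name Digest and the registry shape are invented here for illustration, and the XOR-fold "digest" is not cryptographic, it only makes the sketch runnable.

```rust
use std::collections::HashMap;

// Hypothetical pluggable "provider" registry for digests, in the spirit of
// the Java Cryptography Architecture.
trait Digest {
    fn digest(&self, data: &[u8]) -> Vec<u8>;
}

// Toy provider: XOR-folds all input bytes into one byte. NOT cryptographic.
// A real provider would wrap e.g. OpenSSL or a pure-Rust implementation.
struct XorFold;

impl Digest for XorFold {
    fn digest(&self, data: &[u8]) -> Vec<u8> {
        vec![data.iter().fold(0u8, |acc, b| acc ^ b)]
    }
}

struct Providers {
    digests: HashMap<String, Box<dyn Digest>>,
}

impl Providers {
    fn new() -> Providers {
        Providers { digests: HashMap::new() }
    }

    fn register(&mut self, name: &str, imp: Box<dyn Digest>) {
        self.digests.insert(name.to_string(), imp);
    }

    // Callers ask for an algorithm by name; which provider supplies it is
    // an installation detail, exactly as in the JCA.
    fn digest(&self, name: &str, data: &[u8]) -> Option<Vec<u8>> {
        self.digests.get(name).map(|d| d.digest(data))
    }
}

fn main() {
    let mut p = Providers::new();
    p.register("xor-fold", Box::new(XorFold));
    let out = p.digest("xor-fold", &[1, 2, 3]).unwrap();
    assert_eq!(out, vec![0u8]); // 1 ^ 2 ^ 3 == 0
    println!("digest = {:?}", out);
}
```

Swapping implementations then means registering a different box under the same name, with no change to calling code.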
In a perfect world, I think Rust would provide an abstract, high-level API for encryption backed by multiple "providers", à la the Java Cryptography Architecture
@tarcieri That's a good idea. Seems like the right approach. :+1:
In what way is crypto different from other algorithms provided in standard libraries, that crypto needs a provider architecture but other algorithms don't? I think JCA exists because of legal problems at the time with directly shipping crypto libraries, not some technical reason.
In what way is crypto different from other algorithms provided in standard libraries
Agility around algorithms is more important in crypto than it is with your average data structure. A multi-provider architecture lets you supply a wider range of algorithms than are available in a single library under a single API, and likewise swap out bad algorithms for good ones in the future.
I can see the advantages of being able to naturally express things like "this block cipher used in counter mode provides a stream cipher" but that seems like an application for subtyping/trait implementation rather than for a provider architecture. Can you give an example of the kind of scenario where you're glad you used a provider architecture?
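The trait-based version of "a block cipher in counter mode provides a stream cipher" can be sketched roughly as follows. BlockCipher and ctr_xor are hypothetical names, and ToyCipher (which just adds a key byte per block byte) is not a real cipher; it exists only so the demo runs:

```rust
// Any type implementing the hypothetical BlockCipher trait gets counter-mode
// streaming for free via the generic ctr_xor function below.
trait BlockCipher {
    const BLOCK: usize;
    fn encrypt_block(&self, block: &mut [u8]);
}

// Toy 4-byte "cipher": adds the key to each byte. NOT cryptographic.
struct ToyCipher {
    key: u8,
}

impl BlockCipher for ToyCipher {
    const BLOCK: usize = 4;
    fn encrypt_block(&self, block: &mut [u8]) {
        for b in block.iter_mut() {
            *b = b.wrapping_add(self.key);
        }
    }
}

// Counter mode: encrypt successive counter blocks to produce a keystream,
// then XOR the keystream into the data in place.
fn ctr_xor<C: BlockCipher>(cipher: &C, data: &mut [u8]) {
    for (i, chunk) in data.chunks_mut(C::BLOCK).enumerate() {
        // Counter block: just the block index, little-endian. (Hardcodes a
        // 4-byte block for brevity; a real impl would size by C::BLOCK and
        // include a nonce.)
        let mut keystream = (i as u32).to_le_bytes().to_vec();
        cipher.encrypt_block(&mut keystream);
        for (b, k) in chunk.iter_mut().zip(keystream.iter()) {
            *b ^= k;
        }
    }
}

fn main() {
    let c = ToyCipher { key: 13 };
    let mut msg = b"hello, counter mode".to_vec();
    let orig = msg.clone();
    ctr_xor(&c, &mut msg); // encrypt
    assert_ne!(msg, orig);
    ctr_xor(&c, &mut msg); // XOR with the same keystream again: decrypt
    assert_eq!(msg, orig);
    println!("ctr roundtrip ok");
}
```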
One recent example: Ruby ships only a thin wrapper around OpenSSL, and most Macs ship with ancient versions of OpenSSL that don't support AES-GCM.
It would be "nice" to be able to leverage the rest of OpenSSL, but provide an alternative GCM implementation in the event that GCM is unavailable. Instead of being able to plug in an alternative implementation, we have to use a separate API when doing GCM-with-OpenSSL vs GCM-without-OpenSSL.
Another reason: provider architecture lets you pick the best available implementations of ciphers regardless of what platform you're on. As a project like eBACS shows, this will vary widely depending on implementation and CPU architecture:
Don't know if this was mentioned already, but implementing a generic provider-based API allows you to focus on perfecting the API itself without the distractions of the actual implementation. It forces providers to "shoot for an ideal" in other words, that they otherwise might not.
Do not use NSS. The actual cryptography bits of NSS are not even close to being on par with the OpenSSL (LibreSSL/BoringSSL) implementations. For example, OpenSSL has constant-time implementations of ECC private key operations and NSS doesn't. Also, OpenSSL has many more assembly-language optimized implementations of algorithms that can only be efficiently and safely implemented in assembly language, whereas NSS has only a few. Also, the recommended API to NSS is pk11wrap, which is a wrapper around NSS's PKCS#11 interface, which is a wrapper around the actual crypto code (freebl). The PKCS#11 wrapper is a huge amount of complicated, hard-to-decipher, unnecessarily inefficient code. And, the PKCS#11-based design means that certain optimizations (e.g. AES-GCM optimizations necessary for efficient implementation of AES-GCM TLS cipher suites) are difficult or impossible to do.
In general, the OpenSSL implementations of the low-level crypto primitives that would be used in a browser are the best, or on their way to becoming the best soon. The stuff on top of those low-level primitives leaves a lot to be desired. Also, like I mentioned above, a lot of that code is in assembly language by necessity. Therefore, I think it is useful to consider the idea of creating Rust wrappers around the low-level primitives of some branch of OpenSSL, whether it is LibreSSL or BoringSSL or whatever, perhaps throwing much of the rest of OpenSSL away.
The Go crypto libraries include an implementation of djb's NaCl crypto library. I don't think Rust bindings to libsodium (the NaCl code with better build infrastructure) would be totally unreasonable for the Rust standard library.
I don't think vendoring libsodium into Rust makes sense. It's an evolving library that's constantly adding new features. Vendoring it into the Rust standard library would reduce agility around updates as it would only get updated along with the standard library.
Furthermore, NaCl/libsodium alone doesn't provide a comprehensive cryptography story. While I agree djb's ciphers are totally awesome, there are people with compliance obligations around cryptography (e.g. HIPAA) who have to use NIST ciphers, and may have even more stringent obligations like FIPS 140-2. I think FIPS sucks, but until the laws are changed there's not much that can be done about it.
I say this all as the maintainer of the most popular Ruby binding to libsodium (RbNaCl). I also package libsodium as a RubyGem. I think it would make way more sense for libsodium and a Rust binding to get packaged via Cargo. In fact, I just shipped a RubyGem of libsodium 0.6.0 this morning. (Also I designed the libsodium logo ;)
This is where a provider architecture is nice. You can use both more "traditional" libraries like OpenSSL and more modern libraries like libsodium/NaCl through a single, pluggable API. I've been working on a library like this for Ruby... perhaps I could work on a Rust version too? ;)
I've been working on a library like this for Ruby... perhaps I could work on a Rust version too? ;)
@tarcieri: cause you clearly don't have enough projects to work on already! :P
So the suggestion here is to provide traits such as crypto::Hash, crypto::Hmac, crypto::Digest, crypto::Cipher, and so forth?
Then other libraries could be generic over these traits:
impl Email {
    pub fn anonymize<H: crypto::Hash>(&self, hasher: H) -> String {
        let Email(email) = self;
        hasher.hash(email)
    }
}
@seanmonstar that'd be a good way to implement a provider architecture, although finding a good least common denominator API for each of those may be difficult.
Perhaps the authors of Rust crypto libraries could try to extract a common set of traits and send an RFC for inclusion of those traits into the Rust stdlib.
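A minimal sketch of what one such shared trait might look like if library authors converged on it. The trait shape (update/finish) is invented here, and ByteSum is a non-cryptographic test double that only makes the example runnable:

```rust
// Hypothetical common Digest trait that an RFC might propose.
trait Digest {
    fn update(&mut self, data: &[u8]);
    fn finish(self) -> Vec<u8>;
}

// Test double: "digests" input by summing bytes. NOT cryptographic; a real
// implementor would be e.g. a SHA-256 type from a crypto crate.
struct ByteSum(u64);

impl Digest for ByteSum {
    fn update(&mut self, data: &[u8]) {
        for b in data {
            self.0 = self.0.wrapping_add(*b as u64);
        }
    }
    fn finish(self) -> Vec<u8> {
        self.0.to_be_bytes().to_vec()
    }
}

// Downstream code can then be generic over any conforming implementation,
// as in the Email::anonymize example above.
fn anonymize<D: Digest>(mut digest: D, email: &str) -> String {
    digest.update(email.as_bytes());
    digest
        .finish()
        .iter()
        .map(|b| format!("{:02x}", b))
        .collect()
}

fn main() {
    let tag = anonymize(ByteSum(0), "user@example.com");
    assert_eq!(tag.len(), 16); // 8 output bytes, hex-encoded
    println!("{}", tag);
}
```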
Most people here probably already know, but for the record a Rust crypto meeting happened at Mozilla SF in December 2014. Here's a link to the recording
I made a remark about this issue in my talk at Mozilla SF in December 2014 and I just wanted to clarify.
My remark was specifically that Rust should not have an in-tree crypto story, and it highlighted this issue.
This comes with some caveats. My intended point was about implementing ciphers (and crypto protocols built on them) in-tree, and saying I thought that was a bad idea. My analogy was to Ruby's OpenSSL extension, a binding to OpenSSL that is an inextricable part of the standard library. There is an effort underway to rip OpenSSL out of the Ruby standard library and move it into an out-of-tree external package.
Rust has various in-tree APIs for hash functions (e.g. std::hash) and I think these are non-controversial and totally fine. In fact, I think std::hash::Hasher is great, and one of the other remarks I made in my talk is that I totally want to see more traits like this.
tl;dr: hash functions are fine, but please leave encryption ciphers and public-key digital signature algorithms (and especially crypto protocols!) to out-of-tree libraries developed by crypto experts.
Say a little more about why hash functions are an exception? Thanks!
First, there is one hash function you absolutely do want in the standard library: something like SipHash to use as the basis of a (hashDoS-resistant) hashing function:
http://doc.rust-lang.org/0.11.0/collections/hash/sip/fn.hash.html
Without a cryptographically secure hash function like SipHash, any time an attacker provides the data to be put into e.g. a HashMap, they can potentially collide the buckets the data is placed into, intentionally causing poor algorithmic performance and DoSing the application. This is hashDoS.
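The standard library's HashMap already works this way: RandomState picks a random SipHash key per map, so attacker-chosen inputs can't be precomputed to collide. A short sketch using the same mechanism directly:

```rust
use std::collections::hash_map::RandomState;
use std::hash::{BuildHasher, Hash, Hasher};

fn main() {
    // Fresh random SipHash key for this state.
    let state = RandomState::new();

    let mut h1 = state.build_hasher();
    "attacker-supplied-key".hash(&mut h1);

    let mut h2 = state.build_hasher();
    "attacker-supplied-key".hash(&mut h2);

    // Same state, same input: hashes agree, so HashMap lookups work...
    assert_eq!(h1.finish(), h2.finish());

    // ...but a different process run (different RandomState) would almost
    // certainly produce a different value, which is what defeats hashDoS.
    println!("hash = {:x}", h1.finish());
}
```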
Secondly, having hash functions in the standard library also gets you around the bootstrapping problem of how to verify artifacts in the package manager for the language itself.
It's very easy to verify that a custom hash implementation is correct, especially in Rust, where you don't have to worry about memory safety as much. Checking an entire protocol or cipher is much more difficult, because there are so many possible execution paths.
Modern block ciphers and stream ciphers both tend to be incredibly simple, and there aren't multiple execution paths in general. It's not significantly different from hashes.
@tarcieri: Having a secure random number generator for the hash seeds and other use cases means having a stream cipher implementation too.
@thestinger you need an API for getting cryptographic randomness, but you only need to use it once at the time the program starts. Once you have the key for SipHash seeded you can reuse the same key for all hashing operations globally. I say this specifically in regard to hashDoS.
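The "seed once at startup, reuse the same key everywhere" pattern described here can be sketched with a lazily initialized global RandomState (one random SipHash key shared process-wide). OnceLock is just one way to express this; the function name keyed_hash is invented for illustration:

```rust
use std::collections::hash_map::RandomState;
use std::hash::{BuildHasher, Hash, Hasher};
use std::sync::OnceLock;

// One RandomState (i.e. one random SipHash key), initialized on first use
// and then shared by every call site in the process.
static HASH_STATE: OnceLock<RandomState> = OnceLock::new();

fn keyed_hash<T: Hash>(value: &T) -> u64 {
    let state = HASH_STATE.get_or_init(RandomState::new);
    let mut hasher = state.build_hasher();
    value.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    // All call sites share the same seed, so equal inputs hash equally
    // within one process run.
    assert_eq!(keyed_hash(&"session-id"), keyed_hash(&"session-id"));
    assert_ne!(keyed_hash(&"a"), keyed_hash(&"b")); // overwhelmingly likely
    println!("ok");
}
```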
Totally support SipHash. But I'm not sure I see the point of including other hash functions. Without a public key signature scheme, you can only verify hashes you already hold, which seems pretty limiting for the applications you set out.
A secure random number generator does seem desirable, but it's a very tricky proposition: it's very platform-dependent, it's likely to depend on a good stream cipher and a good hash function, and if there's any platform on which you find you need to do your own entropy management, you'll end up in deep water fast. I'm not sure it's a good feature to try to standardize for the 1.0 release.
Also small terminological note: SipHash is a PRF, not a message digest function like SHA-2/Blake2.
It's not a hash function, it's a PRF? Really? Next you'll be telling me it's not a block cipher, it's a PRP :wink:
Anyway, I think your point was that people shouldn't be using SipHash for anything other than the hash function driving e.g. a HashMap data structure, in which case, yes, that's true.
SipHash is definitely not fungible with e.g. SHA-2/Blake2 and people shouldn't be using it unless they know what they're doing.
@ciphergoth: SipHash is useless without a seed, so the OS requirements are the same for both SipHash and a CSPRNG... the requirements are there on every platform Rust supports.
We've previously made the decision not to distribute any crypto with Rust at all, but this is probably not tenable since crypto is used everywhere. My current opinion is that we should not distribute any crypto written in Rust, but that distributing bindings to well-regarded crypto is fine.
Figure out a strategy here, build consensus, then start implementing a robust crypto library out of tree, with the goal of merging into the main distribution someday. There are some existing efforts along these lines that should be evaluated for this purpose.