Closed. agl closed this issue 4 years ago.
While I'm still wondering if a simple HTML element (no JavaScript) would help websites even more... if you're worried about backwards compatibility, you could add a navigator.credentials.getPublicKey(result) method to return the key in a well-supported format (e.g. PEM); websites could then use a polyfill script for a few years.
Actually, call the method navigator.credentials.parseAuthData(result), where it takes the result of calling create() or get(), and returns something like:
{
  "rpIdHash": "afb64c14d8723ef066d1e108dd60adec30447611664958a5587cdf806ba5ab6b",
  "flags": {
    "UP": true,
    "RFU1": false,
    "UV": false,
    "RFU2a": false,
    "RFU2b": false,
    "RFU2c": false,
    "AT": true,
    "ED": false
  },
  "signCount": 0,
  "attestedCredentialData": {
    "aaguid": "AAAAAAAAAAAAAAAAAAAAAA==",
    "credentialId": "mGYJM5RrXM1b",
    "publicKey": {
      "type": 2,
      "algorithm": -7,
      "curve_type": 1,
      "curve_x": "uELJlQrFdsxGjthRcbrcNwMKDGbsaEoP4T5T6JBdGQM=",
      "curve_y": "XBZY+ZCfmnQia65ZO17sHuD0FkUoAwIbE39G/EfChjI="
    },
    "publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEuELJlQrFdsxGjthRcbrcNwMKDGbs\naEoP4T5T6JBdGQNcFlj5kJ+adCJrrlk7Xuwe4PQWRSgDAhsTf0b8R8KGMg==\n-----END PUBLIC KEY-----"
  },
  "extensions": null
}
So everyone can easily get all of the data out of result.response.attestationObject.authData, and when it comes to create(), it includes a human-readable version of publicKeyBytes, and a PEM formatted version of publicKeyBytes (which I believe most websites will be able to use directly).
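For a sense of what such a parseAuthData method would have to do, here is a minimal sketch of the fixed-offset part of the parsing (the function name and output shape are illustrative, not a real API; the bit positions are taken from the authenticator data layout: UP is bit 0, UV bit 2, AT bit 6, ED bit 7):

```javascript
// Sketch: parse only the fixed-offset prefix of WebAuthn authenticator data:
// a 32-byte rpIdHash, 1 flags byte, and a 4-byte big-endian signCount.
function parseAuthDataPrefix(authData) {
  const bytes = new Uint8Array(authData);
  if (bytes.length < 37) throw new Error("authenticator data too short");
  const flags = bytes[32];
  return {
    rpIdHash: bytes.slice(0, 32),
    flags: {
      UP: !!(flags & 0x01), // user present (bit 0)
      UV: !!(flags & 0x04), // user verified (bit 2)
      AT: !!(flags & 0x40), // attested credential data included (bit 6)
      ED: !!(flags & 0x80), // extension data included (bit 7)
    },
    signCount: new DataView(bytes.buffer, bytes.byteOffset + 33, 4).getUint32(0),
  };
}
```

The attested credential data (AAGUID, credential ID, COSE public key) follows this prefix when AT is set, and that part is where the CBOR decoding comes in.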
This is a duplicate of #557; there, the resolution was:
We discussed that instead of duplicating data, to instead add operations (getters) to obtain web-friendly forms (such as a CryptoKey object).
I agree this seems like the cleanest way forward - probably as methods or getters on PublicKeyCredential? Then those methods can be polyfilled in older browsers. In fact, we can even use such a polyfill as a prototype for evaluating any proposed API changes. I'll volunteer to produce one if we want to go ahead with this.
In the interest of having something concrete to work with, I'll propose updating the AuthenticatorAttestationResponse API to the following:
[SecureContext, Exposed=Window]
interface AuthenticatorAttestationResponse : AuthenticatorResponse {
    [SameObject] readonly attribute ArrayBuffer attestationObject;
    sequence<DOMString> getTransports();
    [SameObject] readonly attribute ArrayBuffer rpIdHash;
    AuthenticatorDataFlags getFlags();
    readonly attribute unsigned long signCount;
    [SameObject] readonly attribute ArrayBuffer aaguid;
    object getPublicKeyJwk();
    [SameObject] readonly attribute ArrayBuffer? extensions;
};

dictionary AuthenticatorDataFlags {
    boolean UP;
    boolean UV;
};
where:

- attestationObject and getTransports are unchanged.
- rpIdHash is the RP ID hash in the authenticator data.
- getFlags() returns an AuthenticatorDataFlags object as described below.
- signCount is the signature counter in the authenticator data.
- aaguid is the AAGUID in the attested credential data.
- getPublicKeyJwk() returns the credential public key encoded as a JWK [RFC 7518] formatted object.
- extensions is null if the ED flag is 0, and the raw extensions part of the authenticator data if the ED flag is 1.

AuthenticatorDataFlags is a new type:

- UP is bit 0 (user present) of the authenticator data flags.
- UV is bit 2 (user verified) of the authenticator data flags.

The AT and ED flags are omitted since their purpose is for parsing the authenticator data.

Analogues for AuthenticatorAssertionResponse are deliberately omitted, because for assertions the values should not be trusted without verifying the signature. If you have to verify the pre-parsed values against the signed byte array anyway, then you're not really benefiting from the pre-parsed values.
@emlun, I’ve got the beginnings of this in tidy.js, although I think the PEM encoding should move out to a publicKeyPem value, so the main publicKey can retain the same structure as the original.
Another thought I just had: maybe it would make sense to make the simplified API available only if the request sets attestation: "none"
(either explicitly or by default)? In every other case the RP does actively care about attestation, so then it seems prudent to not expose the insecure API variant so it can't be used by mistake.
Somehow I missed the reference to CryptoKey in the quote earlier. getPublicKeyJwk() should probably be [SameObject] readonly attribute CryptoKey publicKey; instead.
wrt @emlun:

getPublicKeyJwk() should probably be [SameObject] readonly attribute CryptoKey publicKey; instead
hm, in looking at CryptoKey and Crypto and Crypto.subtle, I'm not sure we'd necessarily want to do that, because in declaring "none" attestation the RP is implicitly declaring that they are not going to be doing attestation signature verification, and only need the publicKey and associated data as user account metadata. In contrast, if we return a CryptoKey object, that implies use of that and the other WebCrypto interfaces' methods to do various crypto operations using the key, but the RP ostensibly isn't going to be doing such with that publicKey (at least not in their client-side JS?).

Maybe just returning the publicKey as a JWK "blob" is sufficient; otherwise, if it's returned as a CryptoKey object, the RP will just call crypto.subtle.exportKey("jwk", publicKey) anyway?
[ Also, I suspect we would need to look closely at the various involved specs to ascertain whether the crypto algorithms (and parameters thereof) that WebAuthn uses (IANA-COSE-ALGS-REG), JWKs, and those that WebCrypto supports are congruent. Not a big task, but necessary due diligence IIUC. ]
@equalsJeffH Yeah, my expectation is that RPs would just export the CryptoKey
rather than use it for crypto operations in JS. My thinking was that returning CryptoKey
would provide different format options "for free" instead of picking just one to support. @craigfrancis seems to prefer PEM, for example, while I guess JWK is the most appropriate "web-native" format if we were to pick just one.
[ Also, I suspect we would need to look closely at the various involved specs to ascertain whether the various crypto algs (and parameters thereof) WebAuthn uses (IANA-COSE-ALGS-REG) and JWK's and those that WebCrypto supports are congruent. not a big task but necessary due diligence IIUC. ]
Good point!
Using exportKey() from CryptoKey works for me.

I only selected PEM because it's already base64 encoded (easy to send to the server), and I could provide it directly to openssl_verify().

Admittedly, I'm not sure if I should be trusting the PEM value like that, as it's a value that comes from the (potentially hostile) user: could they provide a value that's dangerous? A denial of service?
I've also had a look at some of the other projects (notes below), and while most seem to work with the X and Y values directly, PEM/DER was fairly common.
Ruby: cedarcode/webauthn-ruby, uses X/Y.

- [/lib/webauthn/public_key.rb](https://github.com/cedarcode/webauthn-ruby/blob/114a96d20be6504116dc8fcb2570633eb89ab160/lib/webauthn/public_key.rb#L25)
- [/lib/webauthn/authenticator_assertion_response.rb](https://github.com/cedarcode/webauthn-ruby/blob/4dc71eb7a8bf512648c86c536147824c4295e2fe/lib/webauthn/authenticator_assertion_response.rb#L57)
FWIW, speaking on behalf of webauthn-ruby at least, I think it is a tiny bit more accurate to say we are working with the "COSE Key format" (ref), not with X/Y directly. (We switched a while ago, in order to support RSA keys whose params are not X/Y, by relying on cose-ruby.)
Thanks @grzuy, sorry for the guesswork - as you can probably tell, I'm not familiar with Ruby.
I'm just trying to work out what the best export format(s) would be.
My understanding of the "COSE Key Format" is that it's still effectively binary, so I assume you would need something like base64 encoding to get it back to your server, where it will be stored, and later used for verification (would that need a server side CBOR decoder as well?).
So I'm wondering, do you think the "COSE Key Format" is the best approach for all Ruby projects? or are there better formats?
Ideally it would allow the transfer (browser to server), storage, and signature verification steps to be done using as few steps/dependencies as possible.
In the PHP world, I can pass the PEM encoded value directly to the server via a POST request, store it in the database, and pass it directly to OpenSSL with no extra dependencies (I'm still not sure if that's safe to do, but I will be checking that soon).
Thanks @grzuy, sorry for the guesswork - as you can probably tell, I'm not familiar with Ruby.
No worries :-)
I'm just trying to work out what the best export format(s) would be.
My understanding of the "COSE Key Format" is that it's still effectively binary, so I assume you would need something like base64 encoding to get it back to your server
Yes.
For anyone using webauthn-ruby in the RP server, we recommend webauthn-json for corresponding RP client code, so that you get base64url data instead of ArrayBuffers out of the WebAuthn API.
, where it will be stored, and later used for verification (would that need a server side CBOR decoder as well?).
Yes.
webauthn-ruby uses cose-ruby for credential public key deserialization/decoding, which partially implements RFC 8152 and uses cbor-ruby behind the scenes.
So, in summary, the "flow" of the credential public key is:
WebAuthn API ==> webauthn-json ==> webauthn-ruby ==> cose-ruby ==> cbor-ruby
So I'm wondering, do you think the "COSE Key Format" is the best approach for all Ruby projects? or are there better formats?
With cose-ruby out there, we just:
# After binary parsing credential_public_key_cbor out of Authenticator Data
credential_public_key = COSE::Key.deserialize(credential_public_key_cbor)
I hope eventually there will be a COSE library (at least having key deserialization) for every "somewhat popular" programming language. I see just a few in https://github.com/topics/cose, as of today.
@grzuy, thanks for the overview.
I'm just wondering if we could go a bit further: rather than every project needing to include multiple libraries/dependencies, could we get the requirements down to 0?

As in, avoid any parsing, and simply have the browser provide something that can be transferred, stored, and used for verification directly.

In PHP, if the browser provided the key with PEM encoding, the signature checking step can be done with the core functions provided by PHP:
<?php
$key = '-----BEGIN PUBLIC KEY----- [...] -----END PUBLIC KEY-----'; // PEM Encoded
$verify = base64_decode($response['authenticatorData']);
$verify .= hash('sha256', base64_decode($response['clientDataJSON']), true);
$signature = base64_decode($response['signature']);
if (openssl_verify($verify, $signature, $key, OPENSSL_ALGO_SHA256) === 1) {
// Success
}
?>
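On the client side, the corresponding assertion fields could be base64-encoded for the POST request along these lines (a sketch; the field names are chosen to match the $response array used in the PHP example above):

```javascript
// Sketch: turn the assertion's ArrayBuffers into base64 strings for POSTing,
// matching the $response fields consumed by the PHP verification code.
function bufferToBase64(buffer) {
  return btoa(String.fromCharCode(...new Uint8Array(buffer)));
}

function encodeAssertion(assertion) {
  return {
    authenticatorData: bufferToBase64(assertion.response.authenticatorData),
    clientDataJSON: bufferToBase64(assertion.response.clientDataJSON),
    signature: bufferToBase64(assertion.response.signature),
  };
}
```

Here assertion would be the PublicKeyCredential returned by navigator.credentials.get().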
So when it comes to Ruby, is there anything built in that can do the signature verification step? And if so, what format(s) does the key need to be in?
@agl We need a PR created stating the solution.
I don't see how CryptoKey would work, given it doesn't support all the algorithms that FIDO2 supports. Most notably, ed25519 is missing from CryptoKey whilst it's supported by COSE and FIDO2.
For those following along, this should now be supported in Chrome Canary on both desktop and mobile.
@agl Can you clarify what "this" refers to? There are a few things proposed in here and it's unclear which one Chrome went with.
@MasterKale Sorry, I mean the functionality in the attached PR #1395, i.e. the getPublicKey, getPublicKeyAlgorithm, and getAuthenticatorData functions now specified here.
@agl Thanks for adding these methods to the spec and Chrome Canary.
The JS to create and get is considerably easier now (well, it will be when available everywhere, or when I get the time to create a polyfill).

The only minor annoyance is creating Uint8Arrays, and parsing ArrayBuffers, so they can be JSON friendly; but I don't think that's something that can be easily changed.
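That ArrayBuffer-to-JSON friction can at least be contained in two small helpers (a sketch using base64url, the encoding WebAuthn itself uses for challenges and IDs; the names are mine):

```javascript
// Sketch: base64url helpers for moving ArrayBuffers through JSON.
function toBase64Url(buffer) {
  const b64 = btoa(String.fromCharCode(...new Uint8Array(buffer)));
  return b64.replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}

function fromBase64Url(s) {
  const b64 = s.replace(/-/g, "+").replace(/_/g, "/");
  const bin = atob(b64);
  return Uint8Array.from(bin, (c) => c.charCodeAt(0));
}
```

With these, everything going to or from the server can stay as plain strings, and only the calls into the WebAuthn API deal in buffers.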
As to the choice of DER encoding, I think you're right, it's much better than PEM, as PEM is just adding an extra layer, and that layer isn't too difficult to add yourself if you're using openssl_pkey_get_public() in PHP:

// Assuming $der holds the raw DER bytes; PEM is the base64 of the DER,
// wrapped at 64 characters with header/footer lines.
$pem  = '-----BEGIN PUBLIC KEY-----' . "\n";
$pem .= wordwrap(base64_encode($der), 64, "\n", true) . "\n";
$pem .= '-----END PUBLIC KEY-----';
For anyone interested, in the last hour I've created a very basic polyfill.js; it hasn't had much testing, it only works with algorithm -7 (ECDSA with SHA-256), and I've not found a way to conditionally load it.
Anybody have a working Java code snippet on how to verify the publicKey (from AuthenticatorAttestationResponse.getPublicKey()) on the Java server side?
This is what I have; it runs through, but I always get isCorrect == false:
byte[] clientDataJSON = Base64UrlUtil.decode(json.getAsString("response.clientDataJSON"));
MessageDigest md = MessageDigest.getInstance("SHA-256");
byte[] clientDataHash = md.digest(clientDataJSON);
byte[] authenticatorData = Base64UrlUtil.decode(json.getAsString("response.authenticatorData"));
ByteBuffer signatureBase = ByteBuffer.allocate(authenticatorData.length+clientDataHash.length).put(authenticatorData).put(clientDataHash);
byte[] signature = Base64UrlUtil.decode(json.getAsString("response.signature"));
KeyFactory kf = KeyFactory.getInstance("EC");
X509EncodedKeySpec ks = new X509EncodedKeySpec(Base64UrlUtil.decode(<<publicKey from previous AuthenticatorAttestationResponse.getPublicKey()>>));
PublicKey publicKey = kf.generatePublic(ks);
Signature sig = Signature.getInstance("SHA256withECDSA");
sig.initVerify(publicKey);
sig.update(signatureBase);
boolean isCorrect = sig.verify(signature);
PS: AuthenticatorAttestationResponse.getPublicKey() is really great! I just spent 3 days trying to CBOR decode everything in Java on the server before I found this.
Any advice what I am missing in my code to verify the authenticatorData using the signature is highly appreciated!
Testcase info: Google Chrome on Android + Fingerprint, platform authenticator, public-key alg -7
(@CrazyChris75 this issue is closed so people might not see your updates.)
This line gives me pause:
byte[] clientDataJSON = Base64UrlUtil.decode(json.getAsString("response.clientDataJSON"));
clientDataJSON is an ArrayBuffer that contains the JSON directly, i.e. not base64url encoded. Are you encoding the JavaScript object yourself? Are you sure that the encoder is using base64url, not base64?
@agl thanks for your reply!
I should have mentioned that yes, I encode all JS ArrayBuffers to base64url before sending data to the Java server, and the byte[] clientDataJSON from the code above actually contains a valid JSON string (with type, origin, androidPackageName).

I assume the problem is somewhere in Java and my attempt to verify the signature using SHA-256, the "EC" KeyFactory, an X509-encoded key, and a SHA256withECDSA Signature, since I am not sure if that is at all correct. But there are so many potential points of failure that I could be totally wrong, of course.
There's nothing that I can see that's obviously wrong with the code above. I would check that the values are what you expect by hex dumping them in JavaScript and on the server, and confirming that nothing has gotten crossed or weirdly encoded.
The public key for a freshly created credential is provided inside of the attestation object. However, that is a somewhat complex format that involves decoding CBOR in order to read the public key. If a site doesn't care about attestation (as many won't) we might usefully be able to have browsers provide fields of this structure more directly.
Assumption: absent attestation, web site implementations wouldn't need CBOR if we did this. This appears to be true at first glance since the authenticator data is a fixed-offset binary format (not including extensions).
A reason not to do this would be that it encourages sites to depend on these additional fields, which will only be available in newer browsers. Thus people with older browsers might not be able to use WebAuthn, even though they could if sites put in more work. However, this argument applies to any such ergonomic improvement to the API and so, if we buy it, we're forced to conclude that they're mostly a bad idea as a class.