kmackay / micro-ecc

ECDH and ECDSA for 8-bit, 32-bit, and 64-bit processors.
BSD 2-Clause "Simplified" License

P-256 Code Failing Test Vectors #141

Open · karthikbhargavan opened this issue 6 years ago

karthikbhargavan commented 6 years ago

We've been running a bunch of test vectors (from NIST and Wycheproof) on the micro-ecc code (32-bit P-256) and found three test failures, as documented below.

The first failing test is on ECDSA signatures. The test is taken from the Wycheproof test vectors at https://github.com/google/wycheproof/blob/master/testvectors/ecdsa_secp256r1_sha256_test.json, in particular test number 221 (https://github.com/google/wycheproof/blob/2904be69e9d666bf3064fdc15093747e695cfae6/testvectors/ecdsa_secp256r1_sha256_test.json#L1952):

{
  "tcId" : 221,
  "comment" : "Edge case for Shamir multiplication",
  "msg" : "3639383139",
  "sig" : "3044022064a1aab5000d0e804f3e2fc02bdee9be8ff312334e2ba16d11547c97711c898e02206af015971cc30be6d1a206d4e013e0997772a2f91d73286ffd683b9bb2cf4f1b",
  "result" : "valid",
  "flags" : []
}

The micro-ecc code rejects this test as an invalid signature.
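
For anyone trying to reproduce this against uECC directly: uECC_verify takes the public key, a precomputed message hash, and the signature as a raw 64-byte r || s concatenation, whereas Wycheproof provides a DER-encoded signature and the raw message. The following is a minimal sketch (our own illustration, not part of the test suite) that extracts r and s from this particular DER blob, assuming its fixed 30 44 02 20 ... 02 20 ... layout (both integers are exactly 32 bytes with no leading-zero padding); the SHA-256 of msg and the test group's public key are still needed and are not reproduced here.

#include <stdint.h>
#include <stdio.h>

/* Decode a hex string into bytes (same idea as strtobytes further down in this thread). */
static void hex2bytes(const char *str, uint8_t *bytes, int count) {
    for (int c = 0; c < count; ++c) {
        sscanf(str, "%2hhx", &bytes[c]);
        str += 2;
    }
}

int main(void) {
    /* DER signature from Wycheproof tcId 221 (70 bytes: 30 44 02 20 <r> 02 20 <s>). */
    static const char der_hex[] =
        "3044022064a1aab5000d0e804f3e2fc02bdee9be8ff312334e2ba16d11547c97711c898e"
        "02206af015971cc30be6d1a206d4e013e0997772a2f91d73286ffd683b9bb2cf4f1b";
    uint8_t der[70];
    uint8_t raw_sig[64]; /* r || s, the layout uECC_verify() expects */

    hex2bytes(der_hex, der, (int)sizeof der);

    /* In this particular encoding both INTEGERs are exactly 32 bytes,
     * so r starts at offset 4 and s at offset 4 + 32 + 2 = 38. */
    for (int i = 0; i < 32; ++i) {
        raw_sig[i]      = der[4 + i];
        raw_sig[32 + i] = der[38 + i];
    }

    for (int i = 0; i < 64; ++i) printf("%02X", raw_sig[i]);
    printf("\n");
    return 0;
}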

We suspect that the problem is here: https://github.com/kmackay/micro-ecc/blob/601bd11062c551b108adbb43ba99f199b840777c/uECC.c#L1545. The error happens when (tx == rx): the code then zeroes the z coordinate, which means the computation in 'apply_z' (line 1555) returns the wrong result (0), causing the signature check to fail. We modified the code to disable this branch and it then passed all tests.

The two other failing tests are in the P-256 point multiplication code; both are taken from the NIST curve test vectors at http://point-at-infinity.org/ecc/nisttv:

k (dec) = 115792089210356248762697446949407573529996955224135760342422259061068512044367
k (hex) = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC63254F

The expected result is:
x = 7CF27B188D034F7E8A52380304B51AC3C08969E277F21B35A60B48FC47669978
y = F888AAEE24712FC0D6C26539608BCF244582521AC3167DD661FB4862DD878C2E

The result of the execution: 0000

k (dec) = 115792089210356248762697446949407573529996955224135760342422259061068512044368
k (hex) = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632550

The expected result is:
x = 6B17D1F2E12C4247F8BCE6E563A440F277037D812DEB33A0F4A13945D898C296
y = B01CBD1C01E58065711814B583F061E9D431CCA994CEA1313449BF97C840AE0A

The result of the execution:

These two appear to be the cases k = n - 2 and k = n - 1, where n is the order of the base point (so the expected results are -2G and -G, respectively). The cause of these failures needs investigation.

(Reported by N Kulatova, in collaboration with K Bhargavan and B Beurdouche)

niooss-ledger commented 3 years ago

Hello, is there any plan to fix this issue? It does not appear to have been patched, three years after being reported.

More precisely, I have written the following program, using the numbers given in the previous comment:

#include "uECC.h"

#include <assert.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

void vli_print(const char *str, const uint8_t *vli, unsigned int size) {
    printf("%s ", str);
    for(unsigned i=0; i<size; ++i) {
        printf("%02X ", (unsigned)vli[i]);
    }
    printf("\n");
}

void strtobytes(const char* str, uint8_t* bytes, int count) {
  for (int c = 0; c < count; ++c) {
    if (sscanf(str, "%2hhx", &bytes[c]) != 1) {
      printf("Failed to read string to bytes");
      exit(1);
    }
    str += 2;
  }
}

int main() {
    uint8_t private[32] = {0};
    uint8_t public[64] = {0};
    int result;

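    /* k = n - 2, where n is the order of the secp256r1 base point */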
    strtobytes("FFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC63254F", private, 32);
    result = uECC_compute_public_key(private, public, uECC_secp256r1());
    if (result == 1) {
        vli_print("point.x", public, 32);
        vli_print("point.y", public + 32, 32);
    } else {
        printf("TEST 1 failed!\n");
    }

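    /* k = n - 1 */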
    strtobytes("FFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632550", private, 32);
    result = uECC_compute_public_key(private, public, uECC_secp256r1());
    if (result == 1) {
        vli_print("point.x", public, 32);
        vli_print("point.y", public + 32, 32);
    } else {
        printf("TEST 2 failed!\n");
    }
    return 0;
}
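
One possible way to build and run this (the file names here are just illustrative; uECC.c from the repository is compiled directly alongside the test program):

cc -o uecc_repro uecc_repro.c uECC.c
./uecc_repro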

Both calls to uECC_compute_public_key fail (they return a value other than 1) with uECC as of commit https://github.com/kmackay/micro-ecc/commit/24c60e243580c7868f4334a1ba3123481fe1aa48.

What are the consequences of these failures from a security perspective? Could they be used, for example, to leak the private key from a system that uses uECC to sign input data? Is this a security issue?

vt-alt commented 3 years ago

These two appear to be the cases k = n - 2 and k = n - 1, where n is the order of the base point

If I remember correctly, these are not fixable; they are special cases of the underlying algorithm (the n - 2 case is an artifact of regularization?) and thus not an implementation bug. For security, you should perhaps check for these inputs and reject them as invalid.
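
A minimal sketch of such a caller-side guard (my own illustration, not part of the uECC API; it assumes 32-byte big-endian private keys as used by the byte-oriented uECC functions, and it only helps for the scalar-multiplication entry points such as uECC_compute_public_key, not for the Shamir multiplication inside uECC_verify):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* secp256r1 group order n, big-endian. */
static const uint8_t SECP256R1_N[32] = {
    0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x00, 0x00, 0x00,
    0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF,
    0xBC, 0xE6, 0xFA, 0xAD, 0xA7, 0x17, 0x9E, 0x84,
    0xF3, 0xB9, 0xCA, 0xC2, 0xFC, 0x63, 0x25, 0x51
};

/* Accept only 2 <= k <= n - 3. This is slightly stricter than necessary,
 * but it excludes 0, 1, n - 2, n - 1 and anything >= n in one go.
 * memcmp on fixed-width big-endian values gives the numeric ordering.
 * Note: this comparison is not constant-time. */
static int scalar_is_safe(const uint8_t k[32]) {
    uint8_t n_minus_2[32];
    uint8_t two[32] = {0};
    two[31] = 2;

    memcpy(n_minus_2, SECP256R1_N, 32);
    n_minus_2[31] -= 2;  /* last byte 0x51 -> 0x4F, no borrow needed */

    if (memcmp(k, two, 32) < 0) return 0;        /* k is 0 or 1 */
    if (memcmp(k, n_minus_2, 32) >= 0) return 0; /* k >= n - 2 */
    return 1;
}

int main(void) {
    uint8_t k[32] = {0};

    /* n - 1, one of the failing scalars from this issue: rejected. */
    memcpy(k, SECP256R1_N, 32);
    k[31] -= 1;
    printf("n - 1 accepted? %d\n", scalar_is_safe(k));

    /* An ordinary small scalar: accepted. */
    memset(k, 0, 32);
    k[31] = 2;
    printf("k = 2 accepted? %d\n", scalar_is_safe(k));
    return 0;
}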

markehack commented 2 years ago

The co-Z Montgomery ladder will fail for -2, -1, 0, and +1 from n for secp256r1 if the keys are regularized and extended by 1 bit. Either you can reject these keys or, if you like it better, do not regularize the keys, just the initial_Z.

I did some simple timing attacks (note the caveat .. simple) and did not see any clearly measurable timing differences. Without the random Z, I did see some differences, but would need to test a lot more to see if these are even exploitable.
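
For anyone who wants to repeat that kind of quick check, here is a rough sketch of a harness (not the exact setup I used; it assumes a POSIX clock_gettime and uses uECC_compute_public_key as the operation under test, and averages of wall-clock time like this are only a coarse first pass, not a proper statistical leakage test):

#define _POSIX_C_SOURCE 199309L

#include "uECC.h"

#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Time one uECC_compute_public_key() call, in nanoseconds. */
static double time_one(const uint8_t key[32]) {
    uint8_t public[64];
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    (void)uECC_compute_public_key(key, public, uECC_secp256r1());
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (double)(t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
}

int main(void) {
    /* Two arbitrary test scalars: one with mostly zero bits, one with mostly one bits. */
    uint8_t sparse[32] = {0};
    uint8_t dense[32];
    sparse[31] = 3;
    for (int i = 0; i < 32; ++i) dense[i] = 0x7F;

    const int runs = 1000;
    double sparse_total = 0, dense_total = 0;
    for (int i = 0; i < runs; ++i) {
        sparse_total += time_one(sparse);
        dense_total += time_one(dense);
    }
    printf("sparse key: %.0f ns/op\n", sparse_total / runs);
    printf("dense  key: %.0f ns/op\n", dense_total / runs);
    return 0;
}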

webbbitgit commented 1 year ago

Without the random Z, I did see some differences, but would need to test a lot more to see if these are even exploitable.

Interested to hear if additional testing yielded any new information.

markehack commented 1 year ago

I have some time available over the next few weeks to do more testing. This gets quite messy, since part of regularizing is to extend the scalar length by one bit and ensure that its most significant bit is 1, as is required for the initial double. If you do not do that, then you have to find the first 1 bit, which exposes you to timing attacks, or change the ladder calculations, which is a rather invasive change (see https://eprint.iacr.org/2010/309.pdf, Algorithm 7). Doing infinity avoidance is even more complicated. Just excluding the three scalars that fail is less invasive, but I will have a shot at some alternatives.
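
For readers following along, the regularization being discussed works roughly like this (my summary of the standard construction; see the paper linked above). With n the group order:

k' = k + n    if k + n already has bit 256 set,
k' = k + 2n   otherwise.

Either way k' is 257 bits long with its top bit equal to 1, and k'·G = k·G because n·G is the point at infinity. As far as I can tell, the failing scalars are the ones where the ladder's two running points become equal or opposite, which the co-Z addition formulas cannot handle.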
