usnistgov / ACVP-Server

A repository tracking releases of NIST's ACVP server. See www.github.com/usnistgov/ACVP for the protocol.

KDF108-KMAC generating incorrect vectors #274

Closed RandallSteck closed 12 months ago

RandallSteck commented 1 year ago

environment Only run on Demo thus far

testSessionId Several, including 409483

vsId Several, including 1672706 (from session 409483)

Algorithm registration

[
    {
        "algorithm": "KDF",
        "mode": "KMAC",
        "revision": "Sp800-108r1",
        "prereqVals": [
            {
                "algorithm": "KMAC",
                "valValue": "same"
            }
        ],
        "macMode": [
            "KMAC-128",
            "KMAC-256"
        ],
        "keyDerivationKeyLength": [
            {
                "min": 112,
                "max": 4096,
                "increment": 8
            }
        ],
        "contextLength": [
            {
                "min": 8,
                "max": 4096,
                "increment": 8
            }
        ],
        "labelLength": [
            {
                "min": 8,
                "max": 4096,
                "increment": 8
            }
        ],
        "derivedKeyLength": [
            {
                "min": 112,
                "max": 4096,
                "increment": 8
            }
        ]
    }
]

Edit: Add 'code' markdown for formatting

Endpoint in which the error is experienced N/A

Expected behavior Different 'expected' values; different pass/fail results

Additional context

I believe I have detected that the ACVP-Server is generating incorrect 'expected' values on a small number of KDF108-KMAC test vectors.

I am adding KDF108-KMAC support to libacvp with OpenSSL 3.1.1. For about 70% of vector sets, I get fully passing results. For the remaining 30%, I get a single failing result out of ~1000 test cases. Over the course of several runs, I was able to accumulate 4 failing test cases and their expected results. I have today confirmed that Bouncy Castle (v1.75, latest) also generates "incorrect" derived keys that exactly match those produced by OpenSSL for all 4 test cases.

I have provided one example of a failing vector below, which I believe is from the session/vsId 409483/1672706 referenced above. I can provide 3 additional vectors, but not their corresponding vsIds. I can also generate new vsIds in new sessions until I encounter more failures.

    {
        "tgId": 1,
        "testType": "AFT",
        "macMode": "KMAC-128",
        "tests": [
            {
                "tcId": 22,
                "keyDerivationKey": "95D5C56E24939009729CC1AECB1FF50F1C614D637BAA0E0286DE614194E627E94792ADD115347744DF64C6E5E30AD4F9AE7C61EC89C22843A7DBA39BD7706D3981A5B99F19A48C7D4ED726A3EEB6DD297939B20120A88E0032C676A2B2C51B0507B59BC1344D32607E819518C35A0A1D7D5660F27674EE1C67EDED4B040AFFE3F0E9FDF8D45B80DA2B28760FC5F5EE1F127AACB2A90BEF04AD6F1DB4285E89F4915A0EB1791C982A080AFE65799C8C84A215AA8EFF903EFE9B77146D8589E42A8DF0AB51799035BC09AB43E70A8A7CC2DCA23D0C8917E53CAC8842AED9A2D64CF108EFF65E6826E02233A5E4CDEB86DD26EE0F5BD5D6337DE609BAA07057F0AE9010D29068A2B0EBFE13E2BC94CC466728B306C95066",
                "context": "BB5D083A78E1CE07FABF550E9D152056C5DF2AD4888927A09BADE9E515383C17B9CEB18A738B2898C78DDB061815F67BE81CD6E184F6976CE87BC5A07A95",
                "label": "A9DC64EF217825390EDC87C7A8C572A94FC1A0E314C1A98CAF24441C49651E9B0DD77A4BC8AF2D13A5162725F79F55BEDDE11FE68CA90023272D9E4E8BCA5035017F67FE834F8B6EA82D8D15E5D1B94E73CFE79A9A201B75D6FD9B541DD540322C0CF6B1EEC88DC47BBA06A71FD620C381BC23CBFD7C42F83156C710140628AED5ACEC06943A0A969BE516DC184CCC9E0EAF5A468415DF18128AECF95F29A991971FF4EB752F73DB02A7E1DD9454821D50273ABE97BFCEC998613507160D3E7CC2AB319B691CC5486EF20552DE7879489203069EA56C3CC8",
                "derivedKeyLength": 112
            }
        ]
    },

ACVP Expected: "9FBAD534ECF14A21276E67A9899A"

Library results: "C18975F7433E18588BAC69C0F2A0"
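
For reference, here is a minimal sketch of how these values can be recomputed locally, assuming the SP 800-108r1 KMAC construction in which the derived key is a single KMAC call with the key-derivation key as the KMAC key, the context as the message, the label as the customization string, and derivedKeyLength as the output length. The use of pycryptodome's KMAC128 module is my own choice for illustration, not part of libacvp, OpenSSL, or the server.

    # Sketch: K_OUT = KMAC128(key=KDK, X=context, L=out_bits, S=label)
    # Paste the hex fields from the test case above into the call below.
    from Crypto.Hash import KMAC128

    def kdf108_kmac128(kdk_hex: str, context_hex: str, label_hex: str, out_bits: int) -> str:
        mac = KMAC128.new(key=bytes.fromhex(kdk_hex),
                          data=bytes.fromhex(context_hex),
                          custom=bytes.fromhex(label_hex),
                          mac_len=out_bits // 8)
        return mac.hexdigest().upper()

    # Example: kdf108_kmac128(keyDerivationKey, context, label, 112)
    # For tcId 22 above, OpenSSL and Bouncy Castle both produce
    # "C18975F7433E18588BAC69C0F2A0" rather than the server's expected value.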

RandallSteck commented 1 year ago

Attaching tarball with 3 files -- request, response, and expected -- that contain all 4 test cases. First test case (KMAC-128) is the test case presented in the initial issue post.

kdf108-kmac-failures.tar.gz

jbrock24 commented 1 year ago

Hi @RandallSteck, I'll look into this for you, thanks.

celic commented 1 year ago

Thank you for the information. This is very peculiar because KMAC-KDF is just a function that passes the inputs to KMAC. We've had no issues reported on KMAC. I re-ran the test cases you mentioned directly through our KMAC-KDF operation and got results that matched yours. So why did the server get a different answer? That makes this interesting.

celic commented 1 year ago

Unfortunately since vsID 1672706 has expired we no longer have the internal files to take a look. Would you be able to re-run this or send us an active vsID where this issue occurs?

RandallSteck commented 1 year ago

I am OoO until Wednesday, but I can do this on my return. I will produce new vector sets until I generate a failure and report those IDs here.

Thank you for looking into this.

Randy Steck


RandallSteck commented 1 year ago

I've generated new failed values. The byte strings are the values calculated by OpenSSL and confirmed with BC.

SessionID / vsID / tcID: Calculated value

422039 / 1724729 / 67: 7491E29582EDFA0C7FE674C18551760FB197E5F052812AE10399EAA38A9E3A607EF8BAA532041EAC7DF3359FA5FFB0990A8AAD1AD65746CE20C09E32CE76BFF4662ACA621A8D0F2EF2610EA69060A4A88005779C563E4E585EE4DBA147225BC8B2036960B88EB27F66B3CD3E7D48F366283507B6CBD496CEC9B47B7A78807864E40AC63298441CAB57BF568FD50E04A271585FE1153E07FEC6F1C94CAE7A8D0AE5B81427BF4B13B2B2775F55732674878347E630B5925800816DC8F1C86EB204287605D2D3840FF0683C7444F2604C3919BAD7EB9798C9FDE1EBDB34CCC9A8B6A36AAF72D948CB187FB6315E2D4F3EEFDE60C0

422025 / 1724745 / 91: DDCBC1C0B39258EFF130E670AD8ABE59DE93B4CA52F78D65C5AB869E2D1D1584894152D01344BA0A3E003596C31B0FBF3F66D4070AF333CFD1993FD7C8340678F96B4CDE54FA7EA21144AE17E56E7D82EED61E92571E9892502251F8C418E857562B1A4579F483AF77578A3A0AE2CB0FE48C3219C2CEC8BFEBE5A1F99E5992479F96A2

celic commented 1 year ago

Hi @RandallSteck. I think we've got a fix for this. The problem is that this isn't something we can reproduce internally. We verified that our crypto implementation is correct, which makes these sporadic inconsistencies really curious. My guess is that it has to do with the cluster we use to fulfill crypto requests. There is a very small chance that repeated KMAC-KDF requests could cause one of the KMAC instances to access the same state before finishing a previous KMAC-KDF request, which would certainly lead to an incorrect value being generated. This had no bearing on KMAC on its own because KMAC was set up to correctly handle state between requests; only KMAC-KDF was set up in a way that did not handle state consistently. With how the cluster is configured, there is only a small chance that the state from one request would affect another, which would explain the sporadic nature of the wrong values.
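
As a rough illustration of that failure mode (a hypothetical sketch, not the server's code), consider what happens when two requests absorb data into one shared KMAC instance before it is finalized: the result matches neither request computed on its own.

    # Hypothetical sketch of the shared-state failure mode; not the ACVP server's code.
    from Crypto.Hash import KMAC128

    key = bytes(32)  # placeholder 256-bit key for illustration

    def isolated(data: bytes) -> str:
        # Correct behavior: a fresh KMAC instance per request.
        return KMAC128.new(key=key, data=data, mac_len=14).hexdigest()

    req_a, req_b = b"request A context", b"request B context"
    print(isolated(req_a), isolated(req_b))

    # Broken behavior: one shared instance sees both requests' data before finishing.
    shared = KMAC128.new(key=key, mac_len=14)
    shared.update(req_a)
    shared.update(req_b)       # the second request's input bleeds into the first
    print(shared.hexdigest())  # matches neither isolated result above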

We have a fix going through our pipeline that we plan on releasing to Demo soon. It would be great if you could make as many KMAC-KDF requests as possible to confirm this is fixed. Due to some differences in the clustering setup between the developer environments and the deployed environments, I'm not sure this is something we could reproduce locally to test fully ourselves. The fix is my attempt at saving us a week's worth of work just trying to duplicate the issue.

livebe01 commented 1 year ago

Hi @RandallSteck, fyi that the fix Chris mentioned above has been deployed to Demo.

RandallSteck commented 1 year ago

I will hit it as hard as I can today (Fri 7/14). I only have one set of creds, so with TOTP, I can probably run a max of 2 concurrent sessions requesting and running vectors. I’ll post my results around EOD with session/vs IDs if I detect any that contain failed vectors.

Thanks! Randy


celic commented 1 year ago

FYI you can put any number of the same algorithm in a single registration. You could register for 1000 KDF-KMACs in the same test session. Please don't go to that extreme, but this may help with TOTP issues.
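
As a sketch of what that could look like (the count of 50 is arbitrary, and the elided fields are the ones from the registration at the top of this issue), the capabilities array can simply repeat the same entry:

    # Sketch: one registration can repeat the same algorithm entry, so a single
    # test session yields many KDF-KMAC vector sets (helpful with TOTP limits).
    import json

    kdf_kmac_entry = {
        "algorithm": "KDF",
        "mode": "KMAC",
        "revision": "Sp800-108r1",
        # ...remaining capabilities exactly as in the registration above...
    }

    registration = [dict(kdf_kmac_entry) for _ in range(50)]  # 50 copies
    print(json.dumps(registration, indent=4))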

RandallSteck commented 1 year ago

FYI you can put any number of the same algorithm in a single registration.

TIL. With this knowledge...

This evening I generated more than 500 vector sets between 7:45-8:05pm ET. All sets passed. Things look good from my end, but I'll let your internal process judge whether the issue is closed.

Thanks for addressing this issue so quickly!

Randy Steck ThinqSoft www.thinqsoft.com

celic commented 1 year ago

Thanks for confirming for us. It's a big help.

livebe01 commented 1 year ago

The fix for this issue is on Demo in release v1.1.0.30.

livebe01 commented 12 months ago

The fix for this is on Prod in release v1.1.0.30.