IntersectMBO / cardano-ledger

The ledger implementation and specifications of the Cardano blockchain.

add golden tests for `script_data_hash` #4517

Open michele-nuzzi opened 3 months ago

michele-nuzzi commented 3 months ago

Since the introduction of PlutusV3, the `script_data_hash` calculation has broken for some downstream tools, including plu-ts.

I am querying the PlutusV3 cost models directly from a fully synced cardano-node on sanchonet, as follows:

cardano-cli query protocol-parameters --$sancho | jq .costModels.PlutusV3

which gives me back the following array:

[100788,420,1,1,1000,173,0,1,1000,59957,4,1,11183,32,201305,8356,4,16000,100,16000,100,16000,100,16000,100,16000,100,16000,100,100,100,16000,100,94375,32,132994,32,61462,4,72010,178,0,1,22151,32,91189,769,4,2,85848,123203,7305,-900,1716,549,57,85848,0,1,1,1000,42921,4,2,24548,29498,38,1,898148,27279,1,51775,558,1,39184,1000,60594,1,141895,32,83150,32,15299,32,76049,1,13169,4,22100,10,28999,74,1,28999,74,1,43285,552,1,44749,541,1,33852,32,68246,32,72362,32,7243,32,7391,32,11546,32,85848,123203,7305,-900,1716,549,57,85848,0,1,90434,519,0,1,74433,32,85848,123203,7305,-900,1716,549,57,85848,0,1,1,85848,123203,7305,-900,1716,549,57,85848,0,1,955506,213312,0,2,270652,22588,4,1457325,64566,4,20467,1,4,0,141992,32,100788,420,1,1,81663,32,59498,32,20142,32,24588,32,20744,32,25933,32,24623,32,43053543,10,53384111,14333,10,43574283,26308,10,16000,100,16000,100,962335,18,2780678,6,442008,1,52538055,3756,18,267929,18,76433006,8868,18,52948122,18,1995836,36,3227919,12,901022,1,166917843,4307,36,284546,36,158221314,26549,36,74698472,36,333849714,1,254006273,72,2174038,72,2261318,64571,4,207616,8310,4,1293828,28716,63,0,1,1006041,43623,251,0,1]

After checking with the Plutus team, I'm confident in saying that these parameters are preserved.

I have a simple transaction with a single redeemer and no datums.

The only redeemer is the following:

{
    "tag": "Spend",
    "index": 0,
    "execUnits": {
        "steps": "1922100",
        "memory": "10500"
    },
    "data": {
        "int": "0"
    }
}

With Conway allowing multiple representations of the redeemer witnesses, I'm testing both, even though I am aware that the intended representation should be taken "as-is" from the CBOR; the tool used to build the transaction is the same one used for testing.
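For clarity, the two shapes look like this when decoded. Below is a minimal sketch using Python and the cbor2 package (purely illustrative tooling, not what plu-ts or the ledger uses), applied to the leading redeemer bytes of the two exhibits further down:

```python
# Decode the leading redeemer bytes of the two exhibits (illustrative only).
import cbor2

# Conway map form: { [tag, index] => [data, ex_units] }
map_form = bytes.fromhex("a18200008200821929041a001d5434")
print(cbor2.loads(map_form))    # expected: {(0, 0): [0, [10500, 1922100]]}

# legacy array form: [ [tag, index, data, ex_units] ]
array_form = bytes.fromhex("8184000000821929041a001d5434")
print(cbor2.loads(array_form))  # expected: [[0, 0, 0, [10500, 1922100]]]
```

Both encodings describe the same single Spend redeemer shown above; only the bytes differ.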

The expected hash, reported as part of the error message, is:

9a0726be60d68003c541953ded2489526eedc607cd07622c68366a64c264038b

yet nothing I try gives me the same result:

exhibit 1: redeemers as map

input data:

a18200008200821929041a001d5434a10298fb1a000189b41901a401011903e818ad00011903e819ea350401192baf18201a000312591920a404193e801864193e801864193e801864193e801864193e801864193e80186418641864193e8018641a000170a718201a00020782182019f016041a0001194a18b2000119568718201a0001643519030104021a00014f581a0001e143191c893903831906b419022518391a00014f580001011903e819a7a90402195fe419733a1826011a000db464196a8f0119ca3f19022e011999101903e819ecb2011a00022a4718201a000144ce1820193bc318201a0001291101193371041956540a197147184a01197147184a0119a9151902280119aecd19021d0119843c18201a00010a9618201a00011aaa1820191c4b1820191cdf1820192d1a18201a00014f581a0001e143191c893903831906b419022518391a00014f5800011a0001614219020700011a000122c118201a00014f581a0001e143191c893903831906b419022518391a00014f580001011a00014f581a0001e143191c893903831906b419022518391a00014f5800011a000e94721a0003414000021a0004213c19583c041a00163cad19fc3604194ff30104001a00022aa818201a000189b41901a401011a00013eff182019e86a1820194eae182019600c1820195108182019654d182019602f18201a0290f1e70a1a032e93af1937fd0a1a0298e40b1966c40a193e801864193e8018641a000eaf1f121a002a6e06061a0006be98011a0321aac7190eac121a00041699121a048e466e1922a4121a0327ec9a121a001e743c18241a0031410f0c1a000dbf9e011a09f2f6d31910d318241a0004578218241a096e44021967b518241a0473cee818241a13e62472011a0f23d40118481a00212c5618481a0022814619fc3b041a00032b00192076041a0013be0419702c183f00011a000f59d919aa6718fb0001

resulting hash:

3faebd87a3f87898331d155f146aeaa5c5ce1815285f47d3fdb57d73f36bd861

exhibit 2: redeemers as array

input data:

8184000000821929041a001d5434a10298fb1a000189b41901a401011903e818ad00011903e819ea350401192baf18201a000312591920a404193e801864193e801864193e801864193e801864193e801864193e80186418641864193e8018641a000170a718201a00020782182019f016041a0001194a18b2000119568718201a0001643519030104021a00014f581a0001e143191c893903831906b419022518391a00014f580001011903e819a7a90402195fe419733a1826011a000db464196a8f0119ca3f19022e011999101903e819ecb2011a00022a4718201a000144ce1820193bc318201a0001291101193371041956540a197147184a01197147184a0119a9151902280119aecd19021d0119843c18201a00010a9618201a00011aaa1820191c4b1820191cdf1820192d1a18201a00014f581a0001e143191c893903831906b419022518391a00014f5800011a0001614219020700011a000122c118201a00014f581a0001e143191c893903831906b419022518391a00014f580001011a00014f581a0001e143191c893903831906b419022518391a00014f5800011a000e94721a0003414000021a0004213c19583c041a00163cad19fc3604194ff30104001a00022aa818201a000189b41901a401011a00013eff182019e86a1820194eae182019600c1820195108182019654d182019602f18201a0290f1e70a1a032e93af1937fd0a1a0298e40b1966c40a193e801864193e8018641a000eaf1f121a002a6e06061a0006be98011a0321aac7190eac121a00041699121a048e466e1922a4121a0327ec9a121a001e743c18241a0031410f0c1a000dbf9e011a09f2f6d31910d318241a0004578218241a096e44021967b518241a0473cee818241a13e62472011a0f23d40118481a00212c5618481a0022814619fc3b041a00032b00192076041a0013be0419702c183f00011a000f59d919aa6718fb0001

resulting hash:

06c087dd394976edf9806e981ddec70f6f6aba24b63db9b5061cd440b7f3bd82
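
In both exhibits the hashing step itself is plain blake2b-256 (the digest used for script integrity hashes) over the hex-decoded input data. A minimal sketch of that step (Python here, purely illustrative rather than the plu-ts code):

```python
# Hash a hex-encoded script-data preimage with blake2b-256 (illustrative sketch).
import hashlib

def hash_preimage(preimage_hex: str) -> str:
    return hashlib.blake2b(bytes.fromhex(preimage_hex), digest_size=32).hexdigest()

# Paste the full "input data" hex of exhibit 1 or exhibit 2 here to reproduce
# the resulting hashes shown above.
```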
michele-nuzzi commented 3 months ago

In addition to golden tests: since this issue is likely to reappear each time new builtins are added or costs are modified, it would be great if the error message could report the input data used to arrive at the final, expected hash.

michele-nuzzi commented 3 months ago

@lehins tagging you to get this under your radar

This is a major blocker on my side, but of course, I understand there is a lot of stuff going on other than this.

lehins commented 3 months ago

The script hash mechanism has not changed in Conway, so I am not sure exactly what problem you are experiencing. It is worth noting that how the redeemers are represented is not relevant: we use the original bytes that were submitted over the wire for the script integrity hash computation.
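
Roughly, the preimage is the concatenation of those original byte segments, following the [ redeemers | datums | language views ] layout described in the ledger CDDL. A minimal sketch of that construction (the names below are illustrative, not cardano-ledger APIs):

```python
# Rough sketch of the script integrity preimage: the original CBOR bytes of the
# redeemers, followed by the datums (omitted when the transaction carries none),
# followed by the language views, hashed with blake2b-256. Illustrative only.
import hashlib

def script_integrity_hash(redeemers_bytes: bytes,
                          datums_bytes: bytes,          # b"" when there are no datums
                          language_views_bytes: bytes) -> str:
    preimage = redeemers_bytes + datums_bytes + language_views_bytes
    return hashlib.blake2b(preimage, digest_size=32).hexdigest()
```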

> additionally to golden tests, since this issue is likely to re-appear for each time new builtins are added, or some costs modified

Golden tests are not going to help in this case since, as I mentioned, the algorithm has not changed, while cost models can change at any point. Are you sure you are using the correct cost model for computing the hash?

> It would be great if the error message could report the input data used to get to the final, expected hash.

This is definitely a good idea: reporting the original bytes that were used to compute the correct script integrity hash would make debugging issues like this much easier. Unfortunately, we won't be able to add this feature until the next era, or at the earliest the next intra-era hard fork, since we can't change predicate failures at an arbitrary point.

If you include the offending transaction as hex-encoded CBOR, I could assist a little better; until then I can't really tell what is giving you trouble.