Brechtpd opened 2 years ago
General overview of the idea behind data blobs: https://twitter.com/stonecoldpat0/status/1501895333181333507
From https://notes.ethereum.org/@vbuterin/proto_danksharding_faq:
However, blob data is not accessible to EVM execution; the EVM can only view a commitment to the blob.
Instead of using the KZG to represent the blob directly, EIP-4844 uses the versioned hash: a single 0x01 byte (representing the version) followed by the last 31 bytes of the SHA256 hash of the KZG.
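As a concrete sketch, the versioned-hash construction is just a version byte prepended to a truncated SHA-256. The 48-byte zero commitment below is a placeholder; a real input would be a BLS12-381 KZG commitment:

```python
# Sketch of the EIP-4844 versioned-hash construction.
import hashlib

VERSIONED_HASH_VERSION_KZG = b"\x01"

def kzg_to_versioned_hash(commitment: bytes) -> bytes:
    """Version byte 0x01 followed by the last 31 bytes of SHA-256(commitment)."""
    assert len(commitment) == 48  # a KZG commitment is a 48-byte G1 point
    return VERSIONED_HASH_VERSION_KZG + hashlib.sha256(commitment).digest()[1:]

dummy_commitment = bytes(48)  # placeholder for a real KZG commitment
vh = kzg_to_versioned_hash(dummy_commitment)
assert len(vh) == 32 and vh[0] == 0x01
```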
There's an opcode, DATAHASH, that allows us to get the commitment to some blob of data: https://eips.ethereum.org/EIPS/eip-4844#opcode-to-get-versioned-hashes. So we do not have to prove that the commitment matches the data; we know this to be true because Ethereum ensures it.
The point evaluation precompile takes as input a versioned hash, an x coordinate, a y coordinate and a proof (the KZG commitment of the blob and a KZG proof-of-evaluation). It verifies the proof to check that P(x) = y, where P is the polynomial represented by the blob that has the given versioned hash. This precompile is intended to be used by ZK rollups.
A cheap way to evaluate the blob at some random point on L1 without exposing the original data to the smart contract: https://eips.ethereum.org/EIPS/eip-4844#point-evaluation-precompile
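Per the EIP, the precompile takes a fixed 192-byte input: versioned hash, evaluation point, claimed value, commitment, and proof, concatenated in that order. A sketch of the packing (the function name is hypothetical; real callers would ABI-encode this from a contract):

```python
# Sketch of the 192-byte input layout of the point evaluation precompile:
# versioned_hash (32) ++ z (32) ++ y (32) ++ commitment (48) ++ proof (48).
def pack_point_evaluation_input(versioned_hash: bytes, z: int, y: int,
                                commitment: bytes, proof: bytes) -> bytes:
    assert len(versioned_hash) == 32
    assert len(commitment) == 48 and len(proof) == 48  # G1 points
    return (versioned_hash
            + z.to_bytes(32, "big")   # evaluation point (field element)
            + y.to_bytes(32, "big")   # claimed evaluation P(z)
            + commitment              # KZG commitment to the blob polynomial
            + proof)                  # KZG proof-of-evaluation

inp = pack_point_evaluation_input(bytes(32), 0, 0, bytes(48), bytes(48))
assert len(inp) == 192
```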
ZK rollups would provide two commitments to their transaction or state delta data: the kzg in the blob and some commitment using whatever proof system the ZK rollup uses internally. They would use a commitment proof of equivalence protocol, using the point evaluation precompile, to prove that the kzg (which the protocol ensures points to available data) and the ZK rollup’s own commitment refer to the same data.
Then the scheme to use is the one described at https://notes.ethereum.org/@vbuterin/proto_danksharding_faq#Moderate-approach-works-with-any-ZK-SNARK
The scheme works roughly like this:

1. Get the KZG commitment C1 to the blob via the DATAHASH opcode. We know this commitment commits to the data we want.
2. Compute the evaluation point x = hash(C1, C2), where C2 is the ZK rollup's own commitment to the same data.
3. Prove P(x) = y using C1 on Ethereum (via the point evaluation precompile).
4. Pass x, y and C2 to the circuits.
5. Verify inside the circuit that the data committed to by C2 also evaluates to y at point x.

Because the values etc. are on the non-native curve, these are still non-native operations inside the circuit. This method automatically exposes all the data, so we have access to the data in the circuit, and because of these extra in-circuit calculations we have verified it to be the correct data corresponding to C1. Why I think this holds is explained here: both polynomial commitments evaluate to the same value at some random point and so are deemed to commit to the same data: https://notes.ethereum.org/@dankrad/kzg_commitments_in_proofs#The-trick:
The correctness of this scheme follows from the Schwartz-Zippel Lemma if the field is large enough (i.e. 256 bits for 128 bit security)
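The lemma can be sanity-checked exhaustively over a tiny field (an assumption purely for this demo; production fields are ~256 bits, making the agreement probability negligible):

```python
# Toy check of the Schwartz-Zippel bound: two distinct polynomials of
# degree <= d over F_p agree on at most d points, so a random evaluation
# point catches a mismatch with probability >= 1 - d/p.
p = 101
a = [3, 1, 4, 1, 5]   # degree-4 polynomial, coefficients low-to-high
b = [3, 1, 4, 1, 6]   # differs in the leading coefficient

def ev(cs, x):
    acc = 0
    for c in reversed(cs):
        acc = (acc * x + c) % p
    return acc

# Exhaustively count the points where the two polynomials agree.
agree = sum(ev(a, x) == ev(b, x) for x in range(p))
assert agree <= 4  # their difference has degree 4, hence at most 4 roots
```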
Relevant material I also went through: https://dankradfeist.de/ethereum/2021/06/18/pcs-multiproofs.html
Figure out how to handle KZG commitments in a circuit.
Requirement for #22