Closed hackfisher closed 3 years ago
Based on the parachain or the Darwinia chain? This determines which state needs to be validated.
BEEFY will produce merkle mountain range (MMR) roots signed by validators that commit to both new Polkadot Relay chain blocks and new parachain blocks in a form that is cheaply verifiable on Ethereum. Following these commitments will effectively allow us to follow both (1) new Polkadot Relay chain block headers and (2) new Parachain block headers.
A commitment scheme for committing to bridge messages on the parachain and placing those commitments into the parachain header, along with a custom light client for the parachain that runs on Ethereum.
struct Message {
address target;
uint64 nonce;
bytes payload;
}
keccak256(abi.encode(_messages)) == _commitment
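The commitment check above can be sketched as follows. This is only an illustration: sha256 stands in for keccak256 (which is not in the Python stdlib), and the field encoding is a simplified stand-in, not real Solidity ABI encoding.

```python
import hashlib

# Simplified stand-in for Solidity's abi.encode of a Message:
# real abi.encode pads every field to 32 bytes and uses a head/tail layout.
def encode_message(target: bytes, nonce: int, payload: bytes) -> bytes:
    return target.rjust(32, b"\x00") + nonce.to_bytes(8, "big") + payload

# sha256 is used here only because keccak256 is not in the Python stdlib.
def commitment_of(messages) -> bytes:
    return hashlib.sha256(b"".join(encode_message(*m) for m in messages)).digest()

msgs = [(b"\x11" * 20, 1, b"hello"), (b"\x22" * 20, 2, b"world")]
commitment = commitment_of(msgs)
# the on-chain verifier recomputes the hash over the relayed messages
assert commitment_of(msgs) == commitment
```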
Verify BEEFY MMR roots, which are produced and signed by the correct set of validators.
MmrLeaf {
block_number: int
parent_hash: frame_system::Module::<T>::leaf_data(),
parachain_heads: Module::<T>::parachain_heads_merkle_root(),
beefy_authority_set: Module::<T>::beefy_authority_set_merkle_root(),
}
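A rough sketch of how such a leaf might be hashed before insertion into the MMR. The concatenation-based encoding and sha256 are illustrative assumptions; the real leaves are SCALE-encoded and hashed with the chain's own hash function.

```python
import hashlib

def hash_mmr_leaf(block_number: int, parent_hash: bytes,
                  parachain_heads_root: bytes, authority_set_root: bytes) -> bytes:
    # Concatenate the leaf fields and hash them; real BEEFY leaves are
    # SCALE-encoded, this is only a toy encoding for illustration.
    encoded = (block_number.to_bytes(4, "big") + parent_hash
               + parachain_heads_root + authority_set_root)
    return hashlib.sha256(encoded).digest()

leaf = hash_mmr_leaf(42, b"\x00" * 32, b"\x01" * 32, b"\x02" * 32)
```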
The interactive protocol: given the MerkleMountainRangeRootHash and the CurrentBlockHash, ecrecover the signatures and make sure the recovered public keys match the ones we got and the merkle proof is valid.
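The merkle-proof half of that check can be sketched like this, with sha256 as a stand-in hash; the on-chain contract would additionally ecrecover each signature and check the recovered address against the proven validator leaf.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def verify_merkle_proof(root: bytes, leaf: bytes, index: int, proof: list) -> bool:
    # Climb from the leaf to the root, hashing with each sibling in order.
    node = h(leaf)
    for sibling in proof:
        node = h(sibling + node) if index % 2 else h(node + sibling)
        index //= 2
    return node == root

# Toy 2-leaf validator-set tree: root = h(h(v0) + h(v1))
v0, v1 = b"validator-pubkey-0", b"validator-pubkey-1"
root = h(h(v0) + h(v1))
assert verify_merkle_proof(root, v0, 0, [h(v1)])
assert verify_merkle_proof(root, v1, 1, [h(v0)])
```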
After the interactive protocol runs, we have new BEEFY MMR commitments. These are the root hashes of merkle mountain ranges that contain data for updates to the Validator set and data for new parachain headers.
Lastly, with these verified parachain blocks, a parachain light client uses the parachain commitments to verify individual bridge messages. The messages commitment can simply be the hash of abi.encode(_messages).
(1) beefy payload (which is an MMRRoot) --contains-->
(2) Leaves corresponding to new relay chain blocks --contains-->
(3) Single, specific relay chain leaf at index for specific relay chain block --contains-->
(4) hash of all parachain headers in that relay chain block --contains-->
(5) parachain header for our specific parachain at specific block --contains-->
(6) _commitment in our parachain at that block --contains-->
(7) _messages in that commitment
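The containment chain above amounts to a sequence of nested proof checks. A skeletal sketch under toy assumptions (sha256 stand-in hash, simplified encodings; on chain, the intermediate roots and the commitment would be decoded out of the proven leaf and header rather than passed in):

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def verify_merkle_proof(root: bytes, leaf: bytes, index: int, proof: list) -> bool:
    node = h(leaf)
    for sib in proof:
        node = h(sib + node) if index % 2 else h(node + sib)
        index //= 2
    return node == root

def verify_message(mmr_root, leaf, leaf_index, leaf_path,
                   heads_root, head_index, head_path,
                   para_header, commitment, messages) -> bool:
    # (1)-(3): the relay-chain leaf is under the signed BEEFY MMR root
    if not verify_merkle_proof(mmr_root, leaf, leaf_index, leaf_path):
        return False
    # (4)-(5): our parachain header is under the parachain_heads root
    if not verify_merkle_proof(heads_root, para_header, head_index, head_path):
        return False
    # (6)-(7): the relayed messages hash to the commitment in the header
    return h(messages) == commitment

# Toy data mirroring the chain above
messages = b"encoded-messages"
commitment = h(messages)
para_header = b"para-header||" + commitment
heads_root = h(h(para_header) + h(b"other-para-header"))
leaf = h(b"leaf||") + heads_root  # toy leaf embedding the heads root
mmr_root = h(h(leaf) + h(b"other-leaf"))

ok = verify_message(mmr_root, leaf, 0, [h(b"other-leaf")],
                    heads_root, 0, [h(b"other-para-header")],
                    para_header, commitment, messages)
assert ok
```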
- The authority set changes each epoch. Do we need to import a BEEFY MMR commitment each epoch, or on demand?
The authority set (validator set) changes per epoch, which means the BEEFY keys change per epoch. The BEEFY light client is expected to import at least one commitment per epoch, so that it follows the validator set updates and always holds the latest validator set. So the relayer should submit a commitment to the BSC light client smart contract every epoch, or at least submit a commitment whenever the validator set changes, right? @HackFisher @wuminzhe @AurevoirXavier
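One way the per-epoch handover could look is sketched below. This is a toy model reflecting the "import at least one commitment per epoch" expectation; the class, fields, and rotation rule are all illustrative assumptions, not the actual contract interface.

```python
class BeefyLightClient:
    """Toy model: tracks the current and next authority-set roots and
    only accepts commitments signed by a set it already knows about."""

    def __init__(self, current_set_root: bytes, next_set_root: bytes):
        self.current_set_root = current_set_root
        self.next_set_root = next_set_root

    def import_commitment(self, signed_by_root: bytes,
                          new_next_set_root: bytes) -> bool:
        if signed_by_root == self.current_set_root:
            pass  # same epoch: no rotation needed
        elif signed_by_root == self.next_set_root:
            # handover: the next set has started signing, so rotate
            self.current_set_root = self.next_set_root
        else:
            # signed by an unknown set: a mandatory commitment was skipped
            return False
        self.next_set_root = new_next_set_root
        return True
```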
Resources:
- https://github.com/Snowfork/polkadot-ethereum/projects/1
- https://github.com/paritytech/parity-bridges-common/issues/323
- https://github.com/svyatonik/substrate-bridge-sol/blob/master/substrate-bridge.sol
- https://github.com/paritytech/grandpa-bridge-gadget/blob/master/docs/beefy.md#tldr--rationale
- https://hackmd.io/ohOt4jAPT8uu-soJXHUq0Q
Some design updates:
1. Custom commitment for messages in the header. This optimization on the Substrate side will help reduce the gas cost on the Ethereum-like chain side. We are good to have this. cc @wuminzhe @AurevoirXavier Reference: https://github.com/Snowfork/polkadot-ethereum/tree/main/parachain/pallets/commitments
2. Interactive GRANDPA justification verification. We are good to have this. Refer: https://hackmd.io/ohOt4jAPT8uu-soJXHUq0Q https://github.com/paritytech/grandpa-bridge-gadget/blob/master/docs/beefy.md#tldr--rationale
3. Message verification design. In the latest Substrate-to-Substrate bridge, messages on the Substrate side are not deleted until header relay, so the header MMR and MMR verification on the BSC/Ethereum side can be removed to save gas. We are good to have this for now. This differs from the current production implementation. Refer: the Substrate-to-Substrate bridge implementation.
4. Following 3, it is now unclear how the header MMR will still be used. MMR is also mentioned in parity-bridges-common/Snowfork for lane state and the authority set, but that usage is not clear either. MMRs are good to have, but since we are not clear about the must-have use cases, we can treat them as optional for now.