Closed AlexHentschel closed 4 months ago
For updates on this work, see the Core Protocol Working Group: Tuning Flow's Main Consensus. For the technically detailed analysis, please see our Proposed parameter setting (notion).
All work has been merged and is in the process of being tested on previewnet.
Context
In a long-running effort, Tarak has replaced the previously used `relic` cryptography library with `BLST`. As a result, we are seeing notable block-rate increases in model networks. At the moment, it is unclear to what extent these speedups translate to Flow mainnet. Nevertheless, we should see this as an opportunity for improving mainnet performance.

Overview of required work
Step 1: Determine block delay that Cruise Control injects before the HCU (using the prior `relic` crypto stack)

This can be done using the consensus telemetry: the `TelemetryConsumer`'s `OnOwnProposal` method ingests the `OnOwnProposal` notification from the `EventHandler` in exactly the same way as the `MessageHub`, which applies the delay before broadcasting. We can collect the needed data from the Consensus Telemetry via the following Loki query:
We can use the logging `time` stamp as an accurate approximation of the time when the `MessageHub` receives the proposal. Hence, the temporal delay $d$ that the `MessageHub` waits before broadcasting is

$$d = \max(\texttt{targetPublicationTime} - \texttt{time},\ 0)$$

The $\max$ function is important, because `targetPublicationTime` might be earlier than the current time. In this case, the `MessageHub` broadcasts immediately, i.e. $d$ is zero.

Step 2: Determine block delay that Cruise Control injects after the HCU (using the `BLST` crypto stack)

Step 3: Data analysis
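For the data analysis, the broadcast delay $d$ from Step 1 can be recomputed per proposal from the exported telemetry rows. A minimal sketch in Python — the field names (`targetPublicationTime`, `time`) follow the telemetry fields referenced above, but the dict-per-row export format and RFC 3339 timestamps are assumptions, not the actual schema:

```python
from datetime import datetime

def broadcast_delay(record: dict) -> float:
    """Delay d = max(targetPublicationTime - time, 0), in seconds.

    `record` is one exported telemetry row; the field names and
    RFC 3339 timestamp format are assumptions about the export.
    """
    target = datetime.fromisoformat(record["targetPublicationTime"])
    logged = datetime.fromisoformat(record["time"])
    return max((target - logged).total_seconds(), 0.0)

# Target publication 150 ms after the log timestamp -> d = 0.15 s
rec = {
    "targetPublicationTime": "2024-05-01T12:00:00.150+00:00",
    "time": "2024-05-01T12:00:00.000+00:00",
}
print(broadcast_delay(rec))  # 0.15

# Target in the past -> MessageHub broadcasts immediately, d = 0
rec_late = {
    "targetPublicationTime": "2024-05-01T11:59:59.900+00:00",
    "time": "2024-05-01T12:00:00.000+00:00",
}
print(broadcast_delay(rec_late))  # 0.0
```

Aggregating these per-proposal delays before and after the HCU gives the two distributions the comparison needs.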
see Cruise-Control headroom for speedups (notion)

Step 4: Adjust the number of views per Epoch in the system smart contract controlling Epochs
See Cruise-Control headroom for speedups (notion) for details on how to compute the maximal number of views $V'$ that we can choose without reducing the temporal buffer, which compensates for slow or offline nodes, below its current size under the `relic` stack.
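The shape of that constraint can be sketched as simple arithmetic — all names and numbers below are hypothetical illustration values, not the actual mainnet parameters (those live in the notion analysis). Assuming the Epoch targets a fixed wall-clock duration and the temporal buffer is whatever time the views do not consume:

```python
def max_views(epoch_ms: int, view_ms_old: int, view_ms_new: int, views_old: int) -> int:
    """Largest view count V' whose temporal buffer (epoch duration minus
    time consumed by views) is no smaller than the buffer under the old
    (relic) crypto stack. All arguments are illustrative, in milliseconds.
    """
    buffer_old = epoch_ms - views_old * view_ms_old  # today's slack
    # Require: epoch_ms - V' * view_ms_new >= buffer_old
    return (epoch_ms - buffer_old) // view_ms_new

# Hypothetical numbers: 1-week epoch, mean view time 800 ms with relic,
# 700 ms with BLST, 700,000 views per epoch today.
print(max_views(7 * 24 * 3600 * 1000, 800, 700, 700_000))  # 800000
```

Note the bound reduces to $V' \le V \cdot t_{\text{old}} / t_{\text{new}}$ when the epoch duration is held fixed: the views may consume exactly the wall-clock time they consume today, just spread over more of them.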