Open jessicadaugherty opened 2 years ago
The current round robin implementation in consensus/helpers.go:

```go
func (m *consensusModule) electNextLeader(message *typesCons.HotstuffMessage) error {
	leaderId, err := m.leaderElectionMod.ElectNextLeader(message)
	if err != nil || leaderId == 0 {
		// ...
	}
	// ...
}
```
@Olshansk define ty
@gokutheengineer updated AC for review. Ty!
Objective
Implement random leader election proportional to the Validator's total stake that is not susceptible to grinding attacks, while also productionizing the leader election process.
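As a minimal sketch of what "proportional to the Validator's total stake" means (illustrative only — the types and function names below are hypothetical, not this repository's API): a validator holding 60% of the total stake should win roughly 60% of elections. Note that math/rand here is only a stand-in; resisting grinding attacks requires an unbiasable, verifiable randomness source such as a VRF.

```go
package main

import (
	"fmt"
	"math/rand"
)

// Validator is a hypothetical stand-in for the repo's validator type.
type Validator struct {
	Addr  string
	Stake uint64
}

// electProportional picks a validator with probability proportional to its
// stake, given an unbiased random value r in [0, totalStake).
func electProportional(vals []Validator, r uint64) Validator {
	var acc uint64
	for _, v := range vals {
		acc += v.Stake
		if r < acc {
			return v
		}
	}
	return vals[len(vals)-1] // unreachable when r < totalStake
}

func main() {
	vals := []Validator{{"a", 10}, {"b", 30}, {"c", 60}}
	var total uint64
	for _, v := range vals {
		total += v.Stake
	}
	// Simulate many rounds; win counts should approximate the 10:30:60
	// stake ratio. math/rand is NOT grinding-resistant — illustration only.
	counts := map[string]int{}
	rng := rand.New(rand.NewSource(1))
	for i := 0; i < 10000; i++ {
		r := uint64(rng.Int63n(int64(total)))
		counts[electProportional(vals, r).Addr]++
	}
	fmt.Println(counts)
}
```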
Origin Document
The algorithm for HotPOKT leader election can be found in the consensus specification. It is based on Section 5.1 in the Algorand whitepaper.

The leader election package in this repository implements most of the core business logic as a separate library to enable this.
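To give a flavor of cryptographic sortition in the spirit of the Algorand approach (a hypothetical sketch, not the specification's or this repository's algorithm): each validator derives a per-round pseudorandom value from a shared seed, and the values are combined with stakes so the winner is stake-proportional. Here a plain SHA-256 hash stands in for the VRF output, and the Efraimidis–Spirakis weighted-sampling key u^(1/stake) provides the stake weighting; a real implementation must use a VRF so the randomness is verifiable and cannot be ground.

```go
package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
	"math"
)

// Validator is a hypothetical stand-in for the repo's validator type.
type Validator struct {
	Addr  string
	Stake float64
}

// priority maps a validator's per-round randomness to a weighted key
// u^(1/stake); picking the highest key selects each validator with
// probability proportional to its stake (Efraimidis–Spirakis sampling).
// The SHA-256 hash stands in for a VRF output — illustration only.
func priority(seed []byte, round uint64, v Validator) float64 {
	h := sha256.New()
	h.Write(seed)
	var buf [8]byte
	binary.BigEndian.PutUint64(buf[:], round)
	h.Write(buf[:])
	h.Write([]byte(v.Addr))
	sum := h.Sum(nil)
	// Interpret the first 8 bytes as a uniform value u in (0, 1].
	u := (float64(binary.BigEndian.Uint64(sum[:8])) + 1) / (1 << 63) / 2
	return math.Pow(u, 1/v.Stake)
}

// electLeader returns the validator with the highest weighted key.
func electLeader(seed []byte, round uint64, vals []Validator) Validator {
	best := vals[0]
	bestP := priority(seed, round, best)
	for _, v := range vals[1:] {
		if p := priority(seed, round, v); p > bestP {
			best, bestP = v, p
		}
	}
	return best
}

func main() {
	vals := []Validator{{"val1", 10}, {"val2", 30}, {"val3", 60}}
	// Deterministic given the same seed and round, so every honest node
	// arrives at the same leader without further communication.
	fmt.Println(electLeader([]byte("block-seed"), 1, vals).Addr)
}
```

Because the election is a pure function of (seed, round, validator set), every node can compute and verify the same result locally — the property that a VRF preserves while keeping the seed unpredictable.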
The current leader election algorithm uses round robin, defined in consensus/helpers.go.

Goals
Deliverable
Use the leader_election library in place of the current "round robin" leader election process, addressing the TODOs surfaced by:

grep -rl "TODO" ./consensus/leader_election

Non-goals / Non-deliverables
General issue deliverables
Testing Methodology
make test_vrf        # Update if needed
make test_sortition  # Update if needed
make test_all
Verify that LocalNet is still functioning correctly by following the instructions at docs/development/README.md

Creator: @jessicadaugherty
Co-Owners: @Olshansk