This PR contains the architecture refactoring of the crawler tool. The main purpose of this refactor is to remove any dependency we might have on Rumor, building the crawler tool directly from the official modules such as libp2p, go-ethereum, zrnt, etc.
A key point of this refactor is to distinguish each of the modules that take part in a crawler. Crawlers for different networks, and even different blockchains, need the same basic modules (peer discovery, peering service, DB, etc.), so a network/blockchain-agnostic tool should be the final goal. The tool will then be able to swap each of these basic modules for the one specific to a given blockchain.
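To illustrate the idea of swappable, chain-agnostic modules, here is a minimal sketch in Go. All names here (`PeerDiscovery`, `DiscV5Discovery`, `Crawler`) are hypothetical placeholders, not the actual crawler API: each basic module sits behind an interface, and the chain-specific implementation is injected.

```go
package main

import "fmt"

// PeerDiscovery abstracts the peer-discovery module of the crawler.
// (Hypothetical interface for illustration, not the final API.)
type PeerDiscovery interface {
	Name() string
	// Next returns the next discovered peer identifier.
	Next() (string, error)
}

// DiscV5Discovery would wrap Ethereum's discv5 (e.g. via go-ethereum);
// here it is stubbed to keep the sketch self-contained.
type DiscV5Discovery struct{}

func (d *DiscV5Discovery) Name() string { return "discv5" }

func (d *DiscV5Discovery) Next() (string, error) {
	// A real implementation would iterate the discv5 DHT.
	return "enr:-stub", nil
}

// Crawler depends only on the interfaces, so targeting another blockchain
// means swapping the concrete modules passed in here.
type Crawler struct {
	Discovery PeerDiscovery
}

func main() {
	c := Crawler{Discovery: &DiscV5Discovery{}}
	peer, _ := c.Discovery.Next()
	fmt.Println(c.Discovery.Name(), peer)
}
```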
The first draft of the tool's architecture is summarized as follows:
The structure of the tool follows the cobra Go CLI pattern, where the source code is separated from the client's commands.
```
repo:
├── cmd/
│   ├── crawler
│   ├── server (for the future)
│   └── ...
└── src/
    ├── base (base contexts and private loggers for each of the modules)
    ├── discovery (peer discovery modules)
    ├── metrics
    ├── blockchain-node/enode (this first draft focuses on Ethereum)
    ├── message-propagation/gossipsub (message propagation protocols)
    ├── RPC
    ├── utils
    └── prometheus (real-time data export)
```
In addition to the refactor, we would also like to start documenting the tool more seriously, so that tools like go doc can easily generate formatted documentation for each tool release.
The structure and the code are still a draft; please keep in mind that they might change over time. And remember that any feedback regarding testing, structure, best practices, alternatives, etc. is really appreciated 👌🏽
For more info regarding the current structure, please take a look at integral-refactor/README.md.