jakmeier opened 1 year ago
@marcelo-gonzalez which of the above steps do you think could be automated with a reasonable amount of work? And do you have capacity to work on (parts of) this task? (Can I assign it to you?) Or should I look into it / parts of it?
@jakmeier yeah, feel free to assign it to me! I think all of them should be possible to automate, and I've already done some work around that, so I should be able to integrate this into the mocknet test setup. Basically, what we could do is take the current mocknet test here: https://github.com/near/nearcore/blob/master/pytest/tests/mocknet/mirror.py (which we're going to be changing up a bit) and modify it so the user can specify whether to run with mirrored mainnet traffic or locust traffic (or maybe both). A lot of the setup steps should be shared, I think.
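For illustration, a minimal sketch of what that option could look like, assuming a hypothetical `--traffic-source` flag on the mirror.py entry point (the real flag name and wiring are not decided here):

```python
import argparse


def main():
    parser = argparse.ArgumentParser(description='mocknet load test driver')
    # Hypothetical flag; the actual mirror.py interface may end up different.
    parser.add_argument(
        '--traffic-source',
        choices=['mirror', 'locust', 'both'],
        default='mirror',
        help='drive the network with mirrored mainnet traffic, locust traffic, or both',
    )
    args = parser.parse_args()

    # Shared setup steps (forking state, starting nodes, etc.) would run here,
    # regardless of which traffic source was chosen.
    if args.traffic_source in ('mirror', 'both'):
        print('starting mirrored mainnet traffic...')
    if args.traffic_source in ('locust', 'both'):
        print('starting locust traffic...')


if __name__ == '__main__':
    main()
```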
It would also be useful to generate some nice Grafana/Prometheus graphs as part of the load tests, so that we can alarm on them for each new release of the code. Example code that may be useful:
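A minimal sketch of one way to do that, assuming the load test pushes results to a Prometheus Pushgateway that Grafana then graphs (the gateway address, job name, and metric names below are made up for illustration):

```python
from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

# Hypothetical address; the real one would come from the test environment.
PUSHGATEWAY_ADDR = 'localhost:9091'

registry = CollectorRegistry()
tps = Gauge('loadtest_transactions_per_second',
            'Observed transaction throughput', registry=registry)
latency = Gauge('loadtest_tx_latency_seconds',
                'Average observed transaction latency', registry=registry)


def report(observed_tps: float, observed_latency: float) -> None:
    """Push one sample of load test results for Grafana to graph and alert on."""
    tps.set(observed_tps)
    latency.set(observed_latency)
    push_to_gateway(PUSHGATEWAY_ADDR, job='nearcore_loadtest', registry=registry)
```

Grafana alert rules could then be defined against these metric names, so a regression in a new release triggers an alarm automatically.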
@marcelo-gonzalez it looks like the engineers working on storage will want to use this setup soon (maybe next week?) to test background fetching performance.
How far are we with automation? Would you be able to give some instructions on how to set up a testnet fork for such tests? (Assuming it's no longer as painful as described in https://github.com/near/nearcore/issues/9149#issuecomment-1587630570)
The type of benchmark run in #9149 should be easy to repeat with as little manual work as possible.
Work that is currently manual and could be automated: