Closed: niklasdewally closed this 2 months ago
@PedroGGBM this should provide the basic structure for what you need.
I will add the Minion node count sometime tomorrow; it will appear as node_count within the solver_runs section of the JSON.
It will look something like this for each test:
{
  "solver_runs": [
    {
      "conjure_solver_wall_time_s": 0.00949225,
      "node_count": TODO
    }
  ]
}
I've also made the context serializable so that you can have things like the file name. @PedroGGBM
{
  "extra_rule_set_names": [],
  "file_name": "tests/integration/basic/bool/01/bool-01.essence",
  "stats": {
    "solver_runs": [
      {
        "conjure_solver_wall_time_s": 0.004314042,
        "node_count": TODO
      }
    ]
  }
}
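As a rough illustration (not part of the PR itself), a stats file of this shape could be consumed like so; the field names follow the JSON above, and node_count is shown as a placeholder integer since its real value is still TODO:

```python
import json

# Illustrative stats payload matching the structure shown above.
# node_count is a placeholder here; the real value is still TODO in the PR.
raw = """
{
  "extra_rule_set_names": [],
  "file_name": "tests/integration/basic/bool/01/bool-01.essence",
  "stats": {
    "solver_runs": [
      {
        "conjure_solver_wall_time_s": 0.004314042,
        "node_count": 0
      }
    ]
  }
}
"""

data = json.loads(raw)
runs = data["stats"]["solver_runs"]
# Aggregate wall time across all solver runs for this test.
total_wall_time = sum(run["conjure_solver_wall_time_s"] for run in runs)
print(f"{len(runs)} solver run(s), total wall time {total_wall_time:.6f}s")
```

This is the kind of consumer the performance-monitoring work could build on: read each per-test stats file, then aggregate wall time and node counts across tests.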
lines......: 73.3% (4328 of 5907 lines)
functions..: 50.3% (456 of 907 functions)
branches...: no data found
lines......: 73.2% (4285 of 5856 lines)
functions..: 50.2% (451 of 898 functions)
branches...: no data found
Based on #278.
Add global and solver-specific stats objects and save these to (unknown)-stats.json for each integration test.
This provides the JSON output and infrastructure needed for @PedroGGBM's performance-monitoring work.
Adding the Minion node count on top of this will come in the next PR.
See the commit log for details.
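The save-per-test pattern described above can be sketched as follows. This is a hypothetical Python illustration, not the project's actual implementation; the function name save_stats and the "<test name>-stats.json" naming convention are assumptions for the example:

```python
import json
import tempfile
from pathlib import Path

def save_stats(test_name: str, stats: dict, out_dir: Path) -> Path:
    """Write the collected stats for one integration test to
    <test_name>-stats.json inside out_dir.

    Hypothetical sketch of the pattern the PR describes; the real
    implementation lives in the project's own codebase.
    """
    path = out_dir / f"{test_name}-stats.json"
    path.write_text(json.dumps(stats, indent=2))
    return path

# Usage: save a minimal stats object for one test into a temp directory.
out_dir = Path(tempfile.mkdtemp())
out = save_stats(
    "bool-01",
    {"stats": {"solver_runs": [{"conjure_solver_wall_time_s": 0.00949225}]}},
    out_dir,
)
print(out.name)
```

Writing one small JSON file per test keeps each run's stats independent, so a later monitoring step can diff or aggregate them without re-running the tests.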