For now we will just have the profiling outputs stored in the repository and use the GitHub web interface as a simple way of viewing summaries, while also allowing the repository to be cloned locally to perform more detailed analysis.
Any Python scripts for analyzing profiling outputs across commits / time can be kept in the new repository. At some point we can potentially move to automating running these with workflows within the repository.
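As a rough sketch of what such an analysis script might look like: the snippet below assumes a hypothetical layout in which each profiled commit produces one JSON summary file (the directory name, file naming scheme, and the `commit`, `timestamp`, and `runtime_s` fields are all illustrative assumptions, not the actual output format of the profiling workflow).

```python
import json
import tempfile
from pathlib import Path

# Assumed (hypothetical) layout: one JSON summary per profiled commit, e.g.
# profiling_results/<commit_sha>.json with {"commit", "timestamp", "runtime_s"}.


def load_summaries(results_dir: Path) -> list[dict]:
    """Load all per-commit profiling summaries, ordered by commit timestamp."""
    summaries = [json.loads(p.read_text()) for p in results_dir.glob("*.json")]
    return sorted(summaries, key=lambda s: s["timestamp"])


def runtime_trend(summaries: list[dict]) -> list[tuple[str, float]]:
    """Return (short_sha, runtime_s) pairs for inspecting runtime across commits."""
    return [(s["commit"][:7], s["runtime_s"]) for s in summaries]


# Demonstrate with made-up data written to a temporary directory.
if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        results_dir = Path(tmp)
        fake_runs = [("abc1234def", 512.0), ("bcd2345efa", 498.5), ("cde3456fab", 530.2)]
        for i, (sha, runtime) in enumerate(fake_runs):
            (results_dir / f"{sha}.json").write_text(
                json.dumps({"commit": sha, "timestamp": i, "runtime_s": runtime})
            )
        for sha, runtime in runtime_trend(load_summaries(results_dir)):
            print(f"{sha}: {runtime:.1f}s")
```

Keeping each summary as a small per-commit file like this would make the trend analysis a simple directory scan after cloning the repository, with no database or workflow infrastructure required.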
5 provides the framework to do this. We just need to write the appropriate files as an output of the profiling workflow on the TLOmodel repo, and update the relevant build script.
See the master issue on the TLOmodel repo: https://github.com/UCL/TLOmodel/issues/686#issuecomment-1620514010