kalibera / rchk


Run tools with `/usr/bin/time -v` #38

Open krlmlr opened 2 days ago

krlmlr commented 2 days ago

I can't run duckdb on GitHub Actions; the process is being killed.

I'm now running in the cloud, and I'd like to understand how much memory I need.

Monitoring peak memory usage for a process tree, especially when running in Docker, is challenging. How do you feel about running the tools (like `bcheck`) with `/usr/bin/time -v` to report peak memory usage and other stats? Would that interfere with the logic?

https://github.com/kalibera/rchk/blob/8fe3f2769e834cf183577d5ddfa97f9a224fe612/scripts/check_package.sh#L104
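For illustration, something along these lines is what I have in mind (the actual invocation in check_package.sh differs; the variable names below are just placeholders):

```sh
# Sketch only: "$TOOL" and "$PKG_BC" stand in for whatever check_package.sh
# actually runs at that line. GNU time prints "Maximum resident set size"
# and other stats when the tool exits; -o writes the report to a separate
# file so it does not get mixed into the checker's own output.
/usr/bin/time -v -o "$PKG_BC.time" "$TOOL" "$PKG_BC"
```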

kalibera commented 20 hours ago

I think that shouldn't be in the output, as it would only be needed in rare cases (certainly nobody asked before). One could run the container interactively and explore there - if the check finishes in the container on some machine of yours, that might be a way to go (unless you just wanted to run several experiments with different limits).

If you want to reduce the memory use, say in a custom build of the container, you can reduce the maximum number of states in the Makefile - functions that cannot be checked within that limit will then be skipped. Possibly this will not even increase the number of functions skipped for you; the ones that are already being skipped may just be skipped sooner. That is, in my experience, when some function couldn't be checked in the given number of states, increasing the number often didn't help.
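For example, interactive exploration could look roughly like this (the image name, script path and package are only placeholders here - adjust them to your setup):

```sh
# Placeholders throughout: substitute your actual rchk image, scripts
# location and package name.
docker run -it --rm my-rchk-image /bin/bash

# ... then, inside the container, wrap the check in GNU time:
/usr/bin/time -v /path/to/rchk/scripts/check_package.sh duckdb
# The "Maximum resident set size (kbytes)" line reports the peak RSS of the
# largest single waited-on process, not the sum over the whole process tree.
```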