I was attempting to use mprof to benchmark some algorithmic code and realized that, for some of my tests, the setup was using more memory than the code itself, so I couldn't get an accurate measurement from mprof peak alone. Therefore, I introduced the ability to filter by function to help isolate the effect.
This doesn't support multiprocessing, because the .dat file doesn't have FUNC entries for child processes as far as I know. Hope this is a good contribution to mprof!
Filter the peak memory usage by function. This doesn't isolate the
contribution of the function by itself, but it screens out noise
from setting up the test, so an accurate result can be obtained
from mprof peak if the test is otherwise well isolated.
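The idea, roughly, is to restrict the memory samples to the time window
that the .dat file records for the chosen function. The sketch below is
only an illustration of that idea, not the code in this change, and it
assumes the usual MEM/FUNC line layout of mprof .dat files
(MEM <mem> <timestamp>; FUNC <name> <mem_start> <t_start> <mem_end> <t_end>),
which may differ between versions:

def peak_for_func(dat_path, func_name):
    # Collect memory samples and the time windows recorded for func_name,
    # then take the maximum of the samples inside those windows.
    samples = []   # (timestamp, memory in MiB)
    windows = []   # (t_start, t_end) for each recorded call of func_name
    with open(dat_path) as fh:
        for line in fh:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "MEM":
                samples.append((float(parts[2]), float(parts[1])))
            elif parts[0] == "FUNC" and parts[1] == func_name:
                windows.append((float(parts[3]), float(parts[5])))
    peaks = [mem for ts, mem in samples
             if any(t0 <= ts <= t1 for t0, t1 in windows)]
    return max(peaks) if peaks else None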
Example usage:
mprof peak --func some.function file.dat
The function name is the fully qualified Python import name.
You can also figure it out by grepping the .dat file for FUNC
entries.
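For reference, FUNC entries are written when the profiled script marks
functions with memory_profiler's @profile decorator and is run under
mprof run. A minimal test script might look like the following (the file
and function names are just placeholders):

# example_test.py
from memory_profiler import profile

def build_fixture():
    # setup that can use more memory than the code under test
    return list(range(5_000_000))

@profile
def run_algorithm(data):
    # the code whose peak memory we actually want to measure
    return sorted(data, reverse=True)

if __name__ == "__main__":
    run_algorithm(build_fixture())

Running it and then filtering the peak would look something like:
mprof run example_test.py
mprof peak --func example_test.run_algorithm file.dat
where file.dat is the file written by mprof run; if the recorded name is
unclear, grep the file for FUNC as noted above.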