Hey there!

I'm trying to check the memory consumption of a very fast program, for which the sampling interval is too coarse. I think the relevant line is this one:

https://github.com/pythonprofilers/memory_profiler/blob/8a8a40252cccc09dc469445596742dc6b47ed6e3/memory_profiler.py#L239

Basically, my program finishes in about 0.00111576837329873 seconds. Besides the first line, my .dat file contains only these three lines:

```
MEM 1.761719 1586966060.7232
MEM 25.882812 1586966060.8235
MEM 48.011719 1586966060.9238
```

This produces a pretty decent chart. However, can I be sure that three samples accurately represent the memory consumption, especially towards the end of the run? I wish there were more samples. It would be great to be able to experiment with different intervals to find a sweet spot, with the current interval kept as the default.