Open zeryl opened 3 years ago
Howdy! I have a VERY large call graph (likely bad code, recursion, etc.), but I have no way to view it. It's something like 92GB gzipped.
Is there anything that can be done, whether via a C program, parsing, etc., to get even a CLI-level profile out of this, so I can see what's going on?

If you've got a dump of that kind of size, you probably want to rerun it with SPX_SAMPLING_PERIOD set to something reasonably high.
That should dramatically reduce the size of the file you're generating and give you a chance of getting the first optimisations done.
The alternative is to use the spx_profiler_start() functions to profile a subset of your script rather than trying to view it all at once.
In fact, you probably want to make use of both of these simultaneously if you've ended up with that kind of behemoth 😆
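As a minimal sketch of the sampling suggestion, a CLI rerun might look like the following. `my_script.php` and the period value are placeholders for illustration only; check the SPX README for the exact semantics and unit of `SPX_SAMPLING_PERIOD` in your build.

```shell
# Rerun the workload in sampling mode to shrink the dump.
# SPX_SAMPLING_PERIOD=5000 is an illustrative value, not a recommendation;
# raise it further if the output is still too large.
SPX_ENABLED=1 \
SPX_SAMPLING_PERIOD=5000 \
php my_script.php
```

A higher sampling period means fewer recorded events, so the trace file shrinks roughly in proportion, at the cost of coarser timing resolution, which is usually acceptable for a first pass at a 92GB-scale profile.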