michael-schwarz opened 1 month ago
How large are the dump files themselves? It could either be that the input files are already overly massive, or that the comparison itself somehow allocates too much memory somewhere.
They are ~800 MB each.
That's quite significant, although I don't remember how large they were at the time (maybe I still have some lying around). There isn't much that can be done about it, though, I guess. We might also have gotten more precise, which leads to more constraints in the output.
Is there any output over this comparison time or not? I wonder if there's slowdown due to just printing large states (e.g. in `Pretty`/`Format`) or it blows up before that point is reached.
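To illustrate why printing alone can dominate on states this large: a pretty-printer that builds its output by repeated string concatenation can degrade toward quadratic time, while a buffered writer stays linear. A minimal Python sketch of the two patterns (purely illustrative; this is not Goblint's actual `Pretty`/`Format` code):

```python
import io

def naive_pretty(items):
    # Repeated += may copy the entire string built so far on each
    # step, which can degrade toward quadratic time on huge states.
    s = ""
    for it in items:
        s += str(it) + "; "
    return s

def buffered_pretty(items):
    # Writing into a buffer (or using str.join) stays linear.
    buf = io.StringIO()
    for it in items:
        buf.write(str(it))
        buf.write("; ")
    return buf.getvalue()

# Both produce identical output; only the cost model differs.
state = list(range(1000))
assert naive_pretty(state) == buffered_pretty(state)
```

If the hang were in a printing phase like this, one would expect partial output or at least steadily growing output buffers, which is why the question above matters.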
> Is there any output over this comparison time or not?
There's no output at this point.
It does not terminate after more than 35h while gobbling up some 40GB of RAM, even though the individual analyses all terminated within 15min.