jn1z opened this issue 1 week ago
Thank you for the issue and the reproducible example. We will have a look when we have some time. We are aware that Z3-Noodler sometimes fails with memory-out errors caused by Mata when running benchmarks. Sometimes this is to be expected due to the nature of the automata involved and the operations applied to them, but it is possible that in this case there is a bug somewhere. It might take us a while to find the time to work on this issue, however.
Thank you. This appears to be relatively common, even outside of Z3-Noodler. Mintermizing `automata_bench/ltl_afa/created_ltl/LTLf-specific/RespondedExistence/N510.mata` takes more than 20 GB of memory (I terminated it early).
In this case, the file is only 7 lines long (only 4 of which describe states). It uses a lot of variables, but the labels should collapse dramatically.
Thank you for the examples. Things are busy for us at the moment, so we have not had the chance to inspect this issue. I believe there is a problem with our parser (we are long overdue for a full rewrite of the parser, which is too general and abstract). We believe most of the similar issues will be resolved once the parser is rewritten. If these examples are not covered by that, a more thorough review of this issue will have to follow (or precede) the rewrite.
Thank you for the responses. Yes, I think this issue is more prevalent than I originally realized: more than 50% of the files are failing. I hit an additional failure which I'll log separately as well.
I'm attempting to mintermize some AFA-bits files from automata-bench. I've been successful with several, but for `noodler/str_2/quad-002-1-unsat/xx17064.mata` and others, memory usage climbs somewhere past 45 GB.
Maybe this is expected? But the input doesn't seem impossibly complex: if I understand the math of this operation for this file, there should be at most 2^25 labels, of which I assume a lot would collapse.
```
build/examples/example06-mintermization automata-bench/noodler/str_2/quad-002-1-unsat/xx17064.mata
```
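For what it's worth, here is a minimal sketch of the collapse I'd expect, in plain Python rather than Mata (the predicate sets and alphabet below are made up for illustration, not taken from the benchmark file). With n distinct transition labels viewed as predicates over the alphabet, there are at most 2^n minterms, but only the non-empty ones survive, so the result is usually far smaller than the 2^n bound:

```python
def minterms(predicates, alphabet):
    """Compute the non-empty minterms of a set of predicates.

    Each predicate is represented as the set of alphabet symbols it
    accepts. A minterm is an intersection of each predicate or its
    complement; two symbols fall into the same minterm exactly when
    the same subset of predicates accepts them, so we can group
    symbols by that membership signature.
    """
    groups = {}
    for symbol in alphabet:
        # Signature: which predicates accept this symbol.
        key = tuple(symbol in p for p in predicates)
        groups.setdefault(key, set()).add(symbol)
    return list(groups.values())

# Toy example: 3 predicates over an 8-symbol alphabet.
alphabet = set(range(8))
preds = [{0, 1, 2, 3}, {0, 1}, {2, 3}]
parts = minterms(preds, alphabet)
# Up to 2**3 = 8 minterms are possible, but only 3 are non-empty
# here: {0, 1}, {2, 3}, and {4, 5, 6, 7}.
print(len(parts))  # → 3
```

This is why I'd expect the 2^25 upper bound to be far from reached in practice; if the implementation materializes all combinations up front instead of only the non-empty ones, that might explain the blowup, though I haven't checked the Mata code.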