Substituting in facts was O(n) for n facts. This change makes it O(log(n)).

The linear behavior was particularly bad for pipelines with lots of inputs or outputs, because those pipelines have lots of asserts, which make for lots of facts to substitute in.

Speeds up lowering of local laplacian with 20 pyramid levels (which has only one input and one output) by 1.09x.
Speeds up lowering of the Adams 2019 cost model training pipeline (lots of weight inputs and lots of outputs due to derivatives) by 1.5x.
Speeds up lowering of resnet50 (tons of weight inputs) by 7.3x!
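The actual change isn't shown here, but the complexity claim can be illustrated with a toy sketch (the fact representation and names below are hypothetical, not the real Halide data structures): an unsorted list of facts forces a linear scan per lookup, while keeping the facts sorted by key allows a binary search.

```python
import bisect

# Hypothetical facts: variable name -> known value, e.g. gathered from
# pipeline asserts on input/output sizes.
facts = [("height", 1080), ("channels", 3), ("width", 1920)]

# Old-style approach: unsorted list, so each substitution scans all
# n facts -> O(n) per lookup.
def lookup_linear(facts, var):
    for name, value in facts:
        if name == var:
            return value
    return None

# New-style approach: keep the facts sorted by name and binary-search
# the keys -> O(log n) per lookup.
sorted_facts = sorted(facts)
keys = [name for name, _ in sorted_facts]

def lookup_sorted(var):
    i = bisect.bisect_left(keys, var)
    if i < len(keys) and keys[i] == var:
        return sorted_facts[i][1]
    return None
```

With many inputs/outputs (and hence many asserts), each expression node being simplified performs such a lookup, so the per-lookup cost change compounds across the whole pipeline.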