The code below defines a Relax function that allocates a tensor with R.builtin.alloc_tensor inside a dataflow block. Applying the StaticPlanBlockMemory transformation to this module triggers an internal error: Check failed: (!block_stack_.empty()) is false. If the dataflow block is removed, however, the transformation completes without error.
Expected behavior
The StaticPlanBlockMemory transformation should handle functions containing dataflow blocks correctly, performing memory planning without raising internal errors.
Actual behavior
File "/software/tvm/src/relax/transform/static_plan_block_memory.cc", line 597
InternalError: Check failed: (!block_stack_.empty()) is false:
Steps to reproduce
import tvm
from tvm import relax
from tvm.script import ir as I
from tvm.script import relax as R


@I.ir_module
class Module:
    @R.function
    def main() -> R.Tensor((10,), dtype="float32"):
        with R.dataflow():
            gv: R.Tensor((10,), dtype="float32") = R.builtin.alloc_tensor(
                R.shape([10]), R.dtype("float32"), R.prim_value(0), R.str("global")
            )
            R.output(gv)
        return gv


mod = Module
mod_seq = tvm.transform.Sequential([relax.transform.StaticPlanBlockMemory()])(mod)
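For comparison, here is a minimal sketch of the same module with the dataflow block removed. This variant is my reconstruction based on the description above (the module name ModuleNoDataflow is hypothetical); as reported, StaticPlanBlockMemory completes on it without error.

import tvm
from tvm import relax
from tvm.script import ir as I
from tvm.script import relax as R


@I.ir_module
class ModuleNoDataflow:  # hypothetical name for the dataflow-free variant
    @R.function
    def main() -> R.Tensor((10,), dtype="float32"):
        # Same allocation as above, but emitted as a plain binding
        # instead of inside R.dataflow()/R.output().
        gv: R.Tensor((10,), dtype="float32") = R.builtin.alloc_tensor(
            R.shape([10]), R.dtype("float32"), R.prim_value(0), R.str("global")
        )
        return gv


# Per the report, this run does not raise the InternalError.
mod_ok = tvm.transform.Sequential([relax.transform.StaticPlanBlockMemory()])(ModuleNoDataflow)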
This suggests a problem with how dataflow blocks are handled inside the StaticPlanBlockMemory pass. Any guidance or fix would be appreciated.