LL-Math opened this issue 3 years ago
Maybe you need to configure the search time? I think they used a much longer search time in their paper than the default config in this repo.
I will let my colleague address the first question. For the second one, @Z-Y00, you are right: the parameters we used to generate those orders are different from those in the default config file, but one can always change the parameters to those we reported in the paper and regenerate our orders.
But @LL-Math's question is more about the fact that his estimated cost of the reported order is higher than what we reported in arXiv:2005.06787. The caveat is that opt_einsum assumes real FLOPs, which in this case are 2x the actual "contraction cost" reported in our paper. This issue was first mentioned in arXiv:2002.01935: "assuming every contraction is an inner product, for real (complex) tensors, the associated FLOP count will be a factor of two (eight) times more than the actual contraction cost."
Therefore, when reporting the actual contraction cost, we should divide the reported order.cost by 2. I hope this addresses your concerns.
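As a concrete illustration, here is a minimal sketch (not part of the repository; the numeric value is a hypothetical placeholder for whatever order.cost prints in your own run):

```python
import math

# Hypothetical FLOP count, standing in for the value printed by order.cost.
reported_flops = 2.0e18

# For real tensors, inner-product-style FLOP counting (one multiply plus one
# add per scalar pair) is 2x the "contraction cost" convention of the paper,
# so divide by 2; for complex tensors the factor would be 8.
contraction_cost = reported_flops / 2

print(f"log10(contraction cost) = {math.log10(contraction_cost):.2f}")
```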
I will note that usually the first subtask takes more time than the rest: the JAX backend of einsum performs lazy compilation, which takes place while evaluating the first subtask. Therefore you should either use more than a few subtasks to estimate the running time, or exclude the first subtask from the timing.
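To see the effect, here is a minimal, self-contained sketch (not the repository's code; the jitted einsum merely stands in for one subtask) showing that the first call includes compilation time while subsequent calls do not:

```python
import time
import jax
import jax.numpy as jnp

# Illustrative jitted contraction; in the example script each subtask plays
# this role, and its first evaluation triggers the one-off compilation.
@jax.jit
def contract(a, b):
    return jnp.einsum('ij,jk->ik', a, b)

a = jnp.ones((512, 512))
b = jnp.ones((512, 512))

for i in range(3):
    start = time.perf_counter()
    contract(a, b).block_until_ready()  # wait for the async computation to finish
    print(f"call {i}: {time.perf_counter() - start:.4f} s")  # call 0 is dominated by compilation
```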
Hello, after running your code I have the following questions:
1. The time needed to compute one perfect sample is inconsistent with the paper:
Computing one perfect sample requires contracting all slices and summing the results, but in my repeated tests the example code takes longer than the paper reports. For example, setting num_samps to 1 in lines 154-163 of acqdp/examples/circuit_simulation.py to compute a single tsk[i]:
2. The time complexity of the open-sourced orders is inconsistent with the paper:
Loading your open-sourced orders in the example code (acqdp/examples/circuit_simulation.py) and printing order.cost, the best time complexity obtained is