TeamGraphix / graphix

measurement-based quantum computing (MBQC) compiler and simulator
https://graphix.readthedocs.io
Apache License 2.0

Support for arbitrary input states in TensorNetworkBackend #167

Open thierry-martinez opened 2 weeks ago

thierry-martinez commented 2 weeks ago

**Is your feature request related to a problem? Please describe.**
TensorNetworkBackend currently only supports |+> input states.

**Describe the feature you'd like**
We should be able to pass arbitrary states as input_states to TensorNetworkBackend.

**Additional context**
Support for input states has been added for the other backends in https://github.com/TeamGraphix/graphix/pull/135.
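
For context, a minimal usage sketch of the requested feature. The keyword name input_state, the accepted state format, and the backend strings are assumptions modelled on the input-state support added for the other backends in #135, not a confirmed API.

```python
# Hypothetical sketch only: the argument name and accepted formats are assumed
# from the input-state support added for other backends in #135.
import numpy as np
from graphix import Circuit

# Build a small measurement pattern from a 2-qubit circuit.
circuit = Circuit(2)
circuit.h(0)
circuit.cnot(0, 1)
pattern = circuit.transpile().pattern  # older releases may return the pattern directly

# Arbitrary single-qubit input states instead of the default |+>.
plus_i = np.array([1, 1j]) / np.sqrt(2)  # |+i>
zero = np.array([1, 0])                  # |0>

# Assumed to work for the statevector backend after #135; the request is for
# backend="tensornetwork" to accept the same argument.
out = pattern.simulate_pattern(backend="statevector", input_state=[plus_i, zero])
```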

masa10-f commented 2 weeks ago

Hi! Thanks for your feature request. Let me ask: which do you assume as the input_state, a highly entangled state or a sparsely entangled one? Do you want to initialize with an arbitrary state vector?

mgarnier59 commented 2 weeks ago

Hi Masato,

That will be one of the topics of our discussion in July with @shinich1 and @thierry-martinez. We want to have a better grasp of this backend (why it works, when it works, its relation to more traditional TN backends, how it can be extended, etc.).

Indeed, we are currently working on refactoring the backends to make the API simpler and safer. Additionally, since this backend is very specific, the latest features are not implemented there (some of them most likely can't be, like truly arbitrary input states), so it needs special treatment.

masa10-f commented 2 weeks ago

Hi Maxime! Thank you for your response.

> That will be one of the topics of our discussion in July with @shinich1 and @thierry-martinez. We want to have a better grasp of this backend (why it works, when it works, its relation to more traditional TN backends, how it can be extended, etc.).

I'm sorry for not being able to join the discussion. Here are quick answers to your questions.

> Indeed, we are currently working on refactoring the backends to make the API simpler and safer.

Thanks a lot for your commitment.

> Additionally, since this backend is very specific, the latest features are not implemented there (some of them most likely can't be, like truly arbitrary input states), so it needs special treatment.

Yes, I agree with you. I recognize that the coding of this backend is not very sophisticated, and some unused methods are still present (e.g., the graph_prep op). I can refactor this backend and implement arbitrary input states. Has your team already started refactoring the TN backend? If so, please let me know; I do not want to interfere.

> some of them most likely can't be, like truly arbitrary input states

For this part, everything implemented in the SV backend is possible, except for the probability calculation; we just need to prepare the corresponding input tensor. I was just concerned about large states (>30 qubits) in my earlier question.
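
(Illustrative sketch only, not graphix API: two ways one might prepare "the corresponding input tensor" using quimb, which the TN backend relies on for contraction. The index names are made up; the memory estimate in the comments is why very large input states are a concern.)

```python
# Sketch of turning an arbitrary state vector into an input tensor for a
# quimb-based backend. Index/tensor naming here is hypothetical.
import numpy as np
import quimb.tensor as qtn

n = 3
rng = np.random.default_rng(0)
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)

# Option 1: a single rank-n tensor, one open index per input qubit.
# Fine for small n, but the data itself holds 2**n amplitudes, which is why
# states beyond ~30 qubits become a memory concern (2**30 complex128 ~ 16 GB).
dense = qtn.Tensor(psi.reshape([2] * n), inds=[f"k{i}" for i in range(n)])

# Option 2: decompose into an MPS, which keeps the per-tensor size small
# whenever the input state is only sparsely entangled.
mps = qtn.MatrixProductState.from_dense(psi, dims=[2] * n)
print(dense.shape, mps.max_bond())
```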

mgarnier59 commented 2 weeks ago

Thanks a lot!

Yes, indeed, it's too bad you can't be there, but we might schedule a call at some point, or one afterwards to let you know what happened.

> **why it works**
> TN is just a generalization of the usual statevector simulation from the perspective of matrix multiplication. The only difference is in the contraction order.
>
> **when it works**
> When we calculate a unitary, because we can skip the intermediate measurement probability calculations and change the contraction order relative to the SV backend. This is strict especially when a pattern is strongly deterministic, because the probability of 0/1 for all the measurement planes must be 50:50. The important point is whether it is permitted to skip the probability calculations.
>
> **its relation to more traditional TN backends**
> The TN sim in graphix is just a naive implementation (connecting tensors one by one following the pattern commands; we use the quimb contraction backend). There is currently no truncation option as in some TN backends (like MPS).
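
(As a concrete aside, not code from graphix itself: a minimal quimb sketch of the equivalence described above, contracting the same tensors a statevector simulator would apply, CZ on |+>|+> here, and checking that the amplitudes agree whatever contraction order quimb picks.)

```python
# Illustrative sketch (assumed, not graphix code): a tiny tensor network whose
# contraction reproduces the plain statevector result.
import numpy as np
import quimb.tensor as qtn

plus = np.array([1.0, 1.0]) / np.sqrt(2)
cz = np.diag([1.0, 1.0, 1.0, -1.0]).reshape(2, 2, 2, 2)  # CZ as a rank-4 tensor

# Tensor network for CZ |+>|+>: two input tensors glued to the gate tensor.
tn = qtn.TensorNetwork([
    qtn.Tensor(plus, inds=["a"]),
    qtn.Tensor(plus, inds=["b"]),
    qtn.Tensor(cz, inds=["out0", "out1", "a", "b"]),
])
psi_tn = tn.contract(output_inds=["out0", "out1"]).transpose("out0", "out1").data.reshape(-1)

# Same amplitudes from a plain statevector simulation.
psi_sv = np.diag([1.0, 1.0, 1.0, -1.0]) @ np.kron(plus, plus)
assert np.allclose(psi_tn, psi_sv)
```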

That's what I had in mind, thanks. However, I don't yet see how you generalized the paper by Eisert et al. to arbitrary graphs (maybe we can talk about it via Discord?). I planned to discuss that with Shinichi in July anyway.

> Has your team already started refactoring the TN backend? If so, please let me know; I do not want to interfere.

No, we're not touching this backend until we understand it better. We're just trying to maintain compatibility, while keeping in mind that we don't want it to fall too far behind!

> For this part, everything implemented in the SV backend is possible, except for the probability calculation; we just need to prepare the corresponding input tensor. I was just concerned about large states (>30 qubits) in my earlier question.

Great, that's very helpful.