Open · tguillaume opened this issue 6 months ago
Hi @tguillaume, thank you for reporting this bug!
My colleague @sylviemonet has made two PRs to fix this: #746 and #747.
Unfortunately, the automated checks that these PRs must pass before they can be merged into master are currently failing. My recommendation in the meantime is to fork the Strawberry Fields repository and apply the changes from those PRs in your own fork. Hopefully this lets you work with the fixes without waiting on the failing checks!
Thank you for looking into this so quickly! Setting hbar=2 seems to solve the issues for my purposes, but it's still useful to see these PRs.
That's great to hear @tguillaume! Thanks for letting us know.
Expected behavior
Consider a circuit in the Gaussian backend consisting of nModes modes, where the first mode is initialized in a thermal state with mean photon number 0.01. I expect that:
(i) computed Fock probabilities should not depend on the hbar convention;
(ii) a thermal state (in the Gaussian backend) should always be mixed, regardless of the number of modes in the circuit and the hbar convention.
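For context, the expected hbar-independence follows from the standard Gaussian formalism: a thermal mode has covariance matrix V = (hbar/2)(2n̄+1)I, and the purity of an N-mode Gaussian state is (hbar/2)^N / sqrt(det V), so the hbar factors cancel. A minimal NumPy sketch (not using Strawberry Fields itself; `thermal_cov` and `purity` are hypothetical helper names for illustration):

```python
import numpy as np

def thermal_cov(nbar, n_modes, hbar):
    """Covariance matrix (xxpp ordering) of n_modes modes where mode 0 is
    thermal with mean photon number nbar and the rest are vacuum."""
    diag = np.ones(2 * n_modes)
    diag[0] = diag[n_modes] = 2 * nbar + 1  # x and p quadratures of mode 0
    return (hbar / 2) * np.diag(diag)

def purity(cov, hbar):
    """Purity of a Gaussian state: (hbar/2)^N / sqrt(det V)."""
    n_modes = cov.shape[0] // 2
    return (hbar / 2) ** n_modes / np.sqrt(np.linalg.det(cov))

for hbar in (1, 2):
    cov = thermal_cov(0.01, 15, hbar)
    print(hbar, purity(cov, hbar))  # ≈ 0.980392 for both conventions
```

For nbar = 0.01 this gives 1/(2·0.01 + 1) ≈ 0.9804 regardless of hbar or the number of vacuum modes, i.e. the state is always (slightly) mixed.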
Actual behavior
(i) Fock probabilities depend on the hbar convention;
(ii) purity depends on the hbar convention and the size of the circuit.
Reproduces how often
All the time
System information
Source code
Tracebacks
No response
Additional information
I dug a little bit, and this issue may stem from the state being wrongly flagged as pure (some precision limit?). Because the SF state object is considered pure, the fock_prob method uses thewalrus.quantum.pure_state_amplitude rather than thewalrus.quantum.density_matrix_element. But it's unclear to me why the state is wrongly flagged as pure when hbar=1, while it is correctly flagged as mixed when hbar=2. This issue might also be similar to #488; however, unlike that issue, I only see the incorrect pure flag when I set nModes >= 15.
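One way to check the backend's flag independently is to test purity directly from the covariance matrix: a Gaussian state is pure iff det(2V/hbar) = 1. A sketch in plain NumPy (`gaussian_is_pure` is a hypothetical helper, and the covariance matrix is built by hand here rather than taken from the backend):

```python
import numpy as np

def gaussian_is_pure(cov, hbar, atol=1e-10):
    """A Gaussian state is pure iff det(2V/hbar) == 1 (equivalently, all
    symplectic eigenvalues of V equal hbar/2)."""
    return bool(abs(np.linalg.det(2 * cov / hbar) - 1.0) < atol)

# Mode 0 thermal (nbar = 0.01), remaining 14 modes vacuum, xxpp ordering.
for hbar in (1, 2):
    d = np.ones(30)
    d[0] = d[15] = 2 * 0.01 + 1
    cov = (hbar / 2) * np.diag(d)
    print(hbar, gaussian_is_pure(cov, hbar))  # False for both conventions
```

Note the absolute tolerance: if the backend's pureness test compares against a tolerance whose effective size depends on hbar or on the number of modes, a nearly-pure 15-mode state could slip under it in one convention but not the other, which would be consistent with the behavior described above (this is speculation, not a confirmed diagnosis).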