Open DavidSPhillips opened 1 year ago
Attention: Patch coverage is 99.53704%, with 1 line in your changes missing coverage. Please review. Project coverage is 99.95%. Comparing base (f935053) to head (f7358e6).
The cause of the segfault has been found! You cannot use asserts within `numba.prange` loop contexts, as described in the second note of their docs on parallelization. Once I removed that line, I was told: `Use of reduction variable 'haf_arr' other than in a supported reduction function is not permitted`. I updated the loop to not use it, and everything worked!
Wow. That is crazy! Thanks so much!
Context: Calculating probability distributions and density matrices for partially distinguishable squeezed states with multiple internal modes through an interferometer with PNR detection.
Currently in `thewalrus` we only have the ability to simulate interference between Gaussian states which are either indistinguishable or fully distinguishable, and with only one internal mode per spatial mode. There is no framework to simulate partially distinguishable states or states with more than one internal mode per spatial mode (e.g. when we have multiple Schmidt modes). There are code repos that already simulate internal modes and partial distinguishability: `two_squeezer_interference` for two degenerate squeezers, and `GKP_Multimode_Modelling` for the simulation of GKP states from GBS devices. However, both use a combinatorial approach and are therefore inherently slow. Using the same orthonormalisation procedures as those two repos, the additions here employ a new algorithm, the 'Internal Modes Hafnian', which cuts out unnecessary calculations from the combinatorial approach and thus speeds up the computation.
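The orthonormalisation step mentioned above can be sketched generically. This is a hedged illustration using a QR factorisation, with mode functions stored as matrix columns; it is not the actual code from those repos or from this PR.

```python
import numpy as np

def orthonormalise(modes):
    """Orthonormalise a set of mode functions (the columns of `modes`)
    via a QR factorisation. A generic sketch, not the repos' exact code."""
    Q, R = np.linalg.qr(modes)
    # Fix signs so the diagonal of R is non-negative (a common convention,
    # making the result match classical Gram-Schmidt).
    phases = np.sign(np.diag(R))
    phases[phases == 0] = 1
    return Q * phases

# two non-orthogonal "mode functions" sampled on three points
M = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
B = orthonormalise(M)
assert np.allclose(B.T @ B, np.eye(2))  # columns are orthonormal
```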
Description of the Change: Nearly all of the changes are confined to the new `internal_modes` directory within the main `thewalrus` directory, and are reasonably self-contained and well documented. The unit tests for these functions live in the `test_internal_modes.py` script in the `tests` directory, and should be comprehensive, though this is pending a codecov check. The main user-facing functions are `distinguishable_pnr_prob`, `density_matrix_single_mode`, and `pnr_prob`; the remainder are support functions.

The only change in the main `thewalrus` section is the Takagi decomposition. There is a function called `autonne`, which lives in the `symplectic.py` script and performs a Takagi-Autonne decomposition of a symmetric matrix. There is also a function in `strawberryfields` called `takagi` which performs the same decomposition using a different (and arguably more robust) algorithm. This function has been added to `thewalrus` with some additions to make it even more robust. Additionally, both functions now live in the `decompositions.py` script, and given that they have different names they coexist without any problems. This decomposition is needed for some functions in the `internal_modes` directory, and it's the `takagi` version of the function that's used there. Tests for both decomposition functions now live in `test_decompositions.py`.
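As a rough illustration of what a Takagi-Autonne decomposition computes, here is a minimal sketch restricted to real symmetric matrices (an assumption made for simplicity; the library versions handle complex symmetric matrices and degenerate singular values more robustly):

```python
import numpy as np

def takagi_real_symmetric(A):
    """Takagi decomposition A = U diag(s) U^T for a real symmetric A,
    with s >= 0 and U unitary. Sketch only, not the library implementation."""
    lam, O = np.linalg.eigh(A)            # A = O diag(lam) O^T, lam real
    s = np.abs(lam)
    # absorb the sign of each negative eigenvalue into a phase of i on
    # the corresponding column of U, since (i)^2 * |lam| = -|lam|
    U = O * np.exp(0.5j * np.angle(lam))  # column-wise phase
    order = np.argsort(s)[::-1]           # sort Takagi values decreasingly
    return s[order], U[:, order]

A = np.array([[1.0, 2.0],
              [2.0, -3.0]])
s, U = takagi_real_symmetric(A)
assert np.allclose(U @ np.diag(s) @ U.T, A)       # reconstruction
assert np.allclose(U.conj().T @ U, np.eye(2))     # U is unitary
```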
Benefits:
- `thewalrus` now has the ability to simulate both multiple internal modes per spatial mode and partial distinguishability.
- Significantly faster computation for the listed specific system compared to the combinatorial approach in other repos.
- Ability to define the mode shape of the desired output mode (e.g. for homodyne detection).
- All dependencies on `strawberryfields` removed for the entire `thewalrus`.

Possible Drawbacks:
- No drawbacks to pre-existing code.
- New code only applies to squeezed states without displacement.
- For density matrix calculations we're restricted to post-selecting in one mode only (which is mandatory), and only a single internal mode is given (the user can specify the basis function for this mode if they want).
- For photon-number distributions, only the probability of a particular pattern is given; the user must loop over all desired combinations for the full distribution.
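The looping described in the last drawback might look like the following sketch. Here `prob_of_pattern` is a hypothetical stand-in for a user-supplied per-pattern probability call, not part of the library API; a toy independent-mode geometric distribution is used so the example is self-contained.

```python
from itertools import product

def prob_of_pattern(pattern):
    # Hypothetical stand-in for a per-pattern probability call.
    # Toy model: each mode independently yields n photons with prob 0.5**(n+1).
    p = 1.0
    for n in pattern:
        p *= 0.5 ** (n + 1)
    return p

cutoff, modes = 3, 2  # photon numbers 0..2 in each of 2 modes
dist = {pat: prob_of_pattern(pat)
        for pat in product(range(cutoff), repeat=modes)}
total = sum(dist.values())  # < 1 because the cutoff truncates the tail
```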
Related GitHub Issues: None