XanaduAI / thewalrus

A library for the calculation of hafnians, Hermite polynomials and Gaussian boson sampling.
https://the-walrus.readthedocs.io
Apache License 2.0

Internal modes #354

Open DavidSPhillips opened 1 year ago

DavidSPhillips commented 1 year ago

Context: Calculating probability distributions and density matrices for partially distinguishable squeezed states with multiple internal modes through an interferometer with PNR detection.

Currently, thewalrus can only simulate interference between Gaussian states that are either fully indistinguishable or fully distinguishable, with a single internal mode per spatial mode. There is no framework for simulating partially distinguishable states, or states with more than one internal mode per spatial mode (e.g. states with multiple Schmidt modes). There are code repos that already simulate internal modes and partial distinguishability: two_squeezer_interference for two degenerate squeezers, and GKP_Multimode_Modelling for the simulation of GKP states from GBS devices. However, both rely on a combinatorial approach and are therefore inherently slow.

Using the same orthonormalisation procedures as those two repos, the additions in this PR implement a new algorithm, the 'Internal Modes Hafnian', which cuts the unnecessary terms out of the combinatorial approach and thus speeds up the computation.

Description of the Change: Nearly all of the changes are confined to the new internal_modes directory within the main thewalrus directory; the code there is reasonably self-contained and well documented. The unit tests for these functions live in the test_internal_modes.py script in the tests directory and should be comprehensive, pending a codecov check. The main user-facing functions are distinguishable_pnr_prob, density_matrix_single_mode, and pnr_prob; the remainder are support functions.
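As a quick orientation for reviewers, here is a minimal smoke test of the three public entry points named above. It deliberately avoids guessing at call signatures (the authoritative ones are the docstrings in internal_modes) and only checks that the functions import from the new subpackage:

```python
# Minimal smoke test of the new public entry points. Call signatures are
# intentionally not assumed here; see the docstrings in internal_modes.
from thewalrus.internal_modes import (
    distinguishable_pnr_prob,
    density_matrix_single_mode,
    pnr_prob,
)

for fn in (distinguishable_pnr_prob, density_matrix_single_mode, pnr_prob):
    doc = (fn.__doc__ or "").strip().splitlines()
    print(fn.__name__, "--", doc[0] if doc else "(no docstring)")
```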

The only change in the main thewalrus codebase concerns the Takagi decomposition. The existing autonne function in the symplectic.py script performs a Takagi-Autonne decomposition of a symmetric matrix. strawberryfields has a takagi function that performs the same decomposition using a different (and arguably more robust) algorithm; that function has been added to thewalrus, with some additions to make it even more robust. Both functions now live in the decompositions.py script and, since they have different names, they coexist without any problems. The decomposition is needed by some functions in the internal_modes directory, and it is the takagi version that is used there. Tests for both decomposition functions now live in test_decompositions.py.
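For readers unfamiliar with the decomposition: a Takagi-Autonne decomposition factors a symmetric matrix A as A = U diag(d) Uᵀ with U unitary and d ≥ 0. The sketch below is only an illustration of what the decomposition produces, restricted to the special case of a *real* symmetric matrix and built on numpy's eigh; the takagi function added in this PR handles general complex symmetric input and its robustness additions are not reproduced here.

```python
import numpy as np

def takagi_real_symmetric(A):
    """Illustrative Takagi decomposition of a *real symmetric* matrix.

    Returns (d, U) with d >= 0 and U unitary such that
    A = U @ np.diag(d) @ U.T. A toy version only; the takagi function
    added in this PR handles general complex symmetric matrices.
    """
    vals, Q = np.linalg.eigh(A)                 # A = Q @ diag(vals) @ Q.T
    phases = np.where(vals >= 0, 1.0 + 0j, 1j)  # absorb eigenvalue signs into U
    return np.abs(vals), Q * phases             # scales column j of Q by phases[j]

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))
A = M + M.T                                    # random real symmetric test matrix
d, U = takagi_real_symmetric(A)
assert np.allclose(U @ np.diag(d) @ U.T, A)    # reconstructs A
assert np.allclose(U.conj().T @ U, np.eye(4))  # U is unitary
```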

Benefits:

Possible Drawbacks:

Related GitHub Issues: None

codecov[bot] commented 1 year ago

Codecov Report

Attention: Patch coverage is 99.53704% with 1 line in your changes missing coverage. Please review.

Project coverage is 99.95%. Comparing base (f935053) to head (f7358e6).

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master     #354      +/-   ##
===========================================
- Coverage   100.00%   99.95%    -0.05%
===========================================
  Files           28       33        +5
  Lines         1996     2117      +121
===========================================
+ Hits          1996     2116      +120
- Misses           0        1        +1
```

| Files | Coverage Δ |
|---|---|
| thewalrus/_hafnian.py | `100.00% <ø> (ø)` |
| thewalrus/decompositions.py | `100.00% <ø> (ø)` |
| thewalrus/internal_modes/__init__.py | `100.00% <100.00%> (ø)` |
| thewalrus/internal_modes/pnr_statistics.py | `100.00% <100.00%> (ø)` |
| thewalrus/internal_modes/prepare_cov.py | `100.00% <100.00%> (ø)` |
| thewalrus/internal_modes/utils.py | `100.00% <100.00%> (ø)` |
| thewalrus/symplectic.py | `100.00% <ø> (ø)` |
| thewalrus/internal_modes/fock_density_matrices.py | `98.52% <98.52%> (ø)` |

... and [24 files with indirect coverage changes](https://app.codecov.io/gh/XanaduAI/thewalrus/pull/354/indirect-changes). [Continue to review the full report in Codecov](https://app.codecov.io/gh/XanaduAI/thewalrus/pull/354). Last update f935053...f7358e6.
timmysilv commented 2 weeks ago

The cause of the segfault has been found! You cannot use asserts within numba.prange loop contexts, as described in the second note of their docs on parallelization. Once I removed that line, I was told: `Use of reduction variable 'haf_arr' other than in a supported reduction function is not permitted`. I updated the loop to not use it, and everything worked!
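For context, a minimal sketch of the two pitfalls and one way around them. This is hypothetical code, not the actual loop from this PR: the assert inside the prange body is unsupported and can segfault, and accumulating into a shared variable in a non-reduction pattern triggers the reduction-variable error. Writing per-iteration results into distinct slots of a preallocated array sidesteps both.

```python
import numpy as np
from numba import njit, prange

# Failure-mode sketch (kept commented out): the assert is unsupported
# inside a prange body, and `acc` would be flagged as a reduction
# variable used outside a supported reduction pattern.
#
# @njit(parallel=True)
# def broken(xs):
#     acc = np.zeros(1)
#     for i in prange(xs.shape[0]):
#         assert xs[i] >= 0
#         acc[0] += xs[i] ** 2
#     return acc[0]

@njit(parallel=True)
def fixed(xs):
    # Validate serially before the parallel region instead of asserting
    # inside it.
    if np.any(xs < 0):
        raise ValueError("xs must be non-negative")
    # Each iteration writes to its own slot, so there is no shared
    # reduction variable inside the parallel region; the final reduction
    # happens serially afterwards.
    partial = np.empty(xs.shape[0])
    for i in prange(xs.shape[0]):
        partial[i] = xs[i] ** 2
    return partial.sum()

print(fixed(np.arange(5.0)))  # 0 + 1 + 4 + 9 + 16 = 30.0
```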

nquesada commented 2 weeks ago

Wow. That is crazy! Thanks so much!