XanaduAI / thewalrus

A library for the calculation of hafnians, Hermite polynomials and Gaussian boson sampling.
https://the-walrus.readthedocs.io
Apache License 2.0

Displaced torontonian sampling time increase #342

Closed by amanuelanteneh 2 years ago

amanuelanteneh commented 2 years ago


Expected behavior

When using torontonian_sample_state to generate samples from a Gaussian state specified by a covariance matrix cov, generating the same number of samples from a displaced Gaussian state takes significantly longer than from one with zero displacement. For example, when I generate 250 samples from a set of Gaussian states with an average photon number of 3, a maximum photon number of 15, and no displacement, the program finishes in roughly 8 hours. When I specify a mu of 0.25 or 0.10 on each mode, the program should take longer, since there are more probabilities to calculate, but not as long as it does.

Actual behavior

The program doesn't finish (>7 days) when the mu parameter is a length-2N list of all 0.25's or all 0.10's. This is confusing, as I don't think a displacement of this magnitude should have this much of an effect on runtime. Perhaps I am using the mu argument incorrectly?

Reproduces how often

Two times

System information

The program didn't finish in time so the output is not included.

Source code

import numpy as np
from thewalrus.quantum import Covmat, gen_Qmat_from_graph
from thewalrus.samples import torontonian_sample_state

modes = adjMatrix.shape[0]  # adjMatrix: adjacency matrix of the graph (defined elsewhere)
A = adjMatrix
Q = gen_Qmat_from_graph(A, 3)  # embed the graph with mean photon number 3
displacement = 0.25
V = Covmat(Q, hbar=2)  # covariance matrix of the pure Gaussian state
d = np.full(shape=(2 * modes), fill_value=displacement, dtype=np.float64)

samples = torontonian_sample_state(cov=V, mu=d, samples=250, max_photons=15)
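
For reference, a hedged sketch of how a means vector for a uniform complex displacement alpha could be built, assuming the mu argument uses the same xxpp ordering and hbar convention as the covariance matrix (the mode count and alpha value below are illustrative assumptions):

import numpy as np

hbar = 2                            # same convention as Covmat(Q, hbar=2) above
modes = 4                           # illustrative number of modes (assumption)
alpha = np.full(modes, 0.25 + 0j)   # uniform complex displacement per mode (assumption)

# xxpp ordering: all x means first, then all p means, scaled by sqrt(2*hbar)
mu = np.sqrt(2 * hbar) * np.concatenate([alpha.real, alpha.imag])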

Tracebacks

No response

Additional information

No response

CatalinaAlbornoz commented 2 years ago

Hi @amanuelanteneh, thanks for reporting this. Could you please post the output of running import thewalrus; thewalrus.about()?

Also, if you can post your full code, it will be easier to see whether we can reproduce your problem.

amanuelanteneh commented 2 years ago

Hi @CatalinaAlbornoz, Here is the output of thewalrus.about():

The Walrus: a Python library for the calculation of hafnians, Hermite polynomials, and Gaussian boson sampling.
Copyright 2018-2021 Xanadu Quantum Technologies Inc.

Python version:            3.8.8
Platform info:             Linux-3.10.0-1127.19.1.el7.x86_64-x86_64-with-glibc2.10
Installation path:         /home/ph/.local/lib/python3.8/site-packages/thewalrus
The Walrus version:        0.19.0
Numpy version:             1.19.2
Scipy version:             1.5.2
SymPy version:             1.6.2
Numba version:             0.51.2

This is the full function I use to generate the samples, and it's the only place I use functions from The Walrus:

import numpy as np
from thewalrus.quantum import Covmat, gen_Qmat_from_graph
from thewalrus.samples import torontonian_sample_state

def generateSamples(adjMatrix, numSamples, meanN, displacement, maxPhotons):
    modes = adjMatrix.shape[0]

    A = adjMatrix
    Q = gen_Qmat_from_graph(A, meanN)
    V = Covmat(Q, hbar=2)  # covariance matrix of the pure Gaussian state
    d = np.full(shape=(2 * modes), fill_value=displacement, dtype=np.float64)
    samples = torontonian_sample_state(cov=V, mu=d, samples=numSamples, max_photons=maxPhotons)

    return samples

It takes the adjacency matrix of a graph and uses functions from The Walrus to embed the matrix into a GBS device, which I then sample from. I call this function multiple times over a dataset of graphs. The function runs in the expected amount of time when displacement equals 0, but for displacements of 0.1 and 0.25 on each mode the runtime increases drastically (from about 8 hours to over 7 days), which I don't believe should be the case.
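
A minimal sketch of how the two cases might be timed on a single small random graph, using the generateSamples function above (the graph size, edge probability, and sample count are illustrative assumptions, and networkx is assumed to be available):

import time
import networkx as nx

# Small random graph so both cases finish quickly (sizes are assumptions)
adj = nx.to_numpy_array(nx.erdos_renyi_graph(n=8, p=0.5, seed=42))

for disp in (0.0, 0.25):
    start = time.perf_counter()
    generateSamples(adj, numSamples=50, meanN=3, displacement=disp, maxPhotons=15)
    print(f"displacement={disp}: {time.perf_counter() - start:.1f} s")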

CatalinaAlbornoz commented 2 years ago

Thank you @amanuelanteneh. We will take a look into this.

amanuelanteneh commented 2 years ago

@CatalinaAlbornoz Thanks! It might also be that this isn't how the displacement is meant to be specified; however, I can't find any example code on how to sample from a displaced Gaussian state using thewalrus, so I'm unsure.
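
For what it's worth, a minimal sketch of sampling a displaced Gaussian state directly from a covariance matrix and means vector; the two-mode squeezed vacuum, squeezing value, and displacement below are illustrative assumptions rather than an official example:

import numpy as np
from thewalrus.samples import torontonian_sample_state

hbar = 2
r = 0.5  # two-mode squeezing parameter (assumption)
ch, sh = np.cosh(2 * r), np.sinh(2 * r)

# Covariance matrix of a two-mode squeezed vacuum in xxpp ordering (x1, x2, p1, p2)
cov = (hbar / 2) * np.array([
    [ch,  sh,  0.0, 0.0],
    [sh,  ch,  0.0, 0.0],
    [0.0, 0.0, ch,  -sh],
    [0.0, 0.0, -sh, ch],
])

# Uniform displacement alpha = 0.25 on each mode (assumption);
# means vector mu = sqrt(2*hbar) * (Re(alpha), Im(alpha))
alpha = np.full(2, 0.25 + 0j)
mu = np.sqrt(2 * hbar) * np.concatenate([alpha.real, alpha.imag])

samples = torontonian_sample_state(cov=cov, mu=mu, samples=10, max_photons=6)
print(samples)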

ziofil commented 2 years ago

The slowdown should only be 2x. How many modes are you working with?

amanuelanteneh commented 2 years ago

@ziofil On average the graphs I'm looking at have about 25 nodes, so around 50 modes.

ziofil commented 2 years ago

Those are a lot of modes... I'm wondering if the slowdown is due to other effects (like memory swapping and so on). Could you try with fewer modes and see if the slowdown is the same?
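
A hedged sketch of such a scan, comparing displaced and undisplaced sampling time as the number of modes grows, using the generateSamples function posted above (graph sizes, edge probability, and sample counts are illustrative assumptions):

import time
import networkx as nx

for n in (6, 10, 14, 18):
    adj = nx.to_numpy_array(nx.erdos_renyi_graph(n=n, p=0.5, seed=0))
    times = []
    for disp in (0.0, 0.25):
        start = time.perf_counter()
        generateSamples(adj, numSamples=20, meanN=3, displacement=disp, maxPhotons=15)
        times.append(time.perf_counter() - start)
    print(f"{n} modes: undisplaced {times[0]:.1f} s, displaced {times[1]:.1f} s")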

ziofil commented 2 years ago

Also, how sparse is your adjacency matrix?

amanuelanteneh commented 2 years ago

The adjacency matrices range in sparsity but the majority seem fairly sparse.

CatalinaAlbornoz commented 2 years ago

Thank you @amanuelanteneh. Do you see the same slowdown with fewer modes?

amanuelanteneh commented 2 years ago

Hello, I've been running some more simulations and it appears that the issue is no longer a problem. Sorry for the delay in response.