CQCL / lambeq

A high-level Python library for Quantum Natural Language Processing
https://cqcl.github.io/lambeq-docs
Apache License 2.0

Slow performance of trainer #25

Closed Stephenito closed 2 years ago

Stephenito commented 2 years ago

Hi, I am running the Python quantum trainer example from the documentation (with AerBackend and TketModel), and the training time for each epoch is almost 1 minute. I would like to know how I can speed it up and what the most likely bottleneck is. I am using 100 samples for the training set and 30 for testing. Thanks.

y-richie-y commented 2 years ago

Hi, the AerBackend can be used with TketModel to perform a noisy, architecture-aware simulation of an IBM machine. However, for initial experiments we recommend using NumpyModel, which performs noiseless simulations and is orders of magnitude faster.

Stephenito commented 2 years ago

Thanks, I tried NumpyModel and it runs approximately five times faster.

dimkart commented 2 years ago

This will now be closed.