Closed: ACE07-Sev closed this issue 1 year ago
@ACE07-Sev Hi, if this is a feature request you can open a Discussion, but please don't advertise it as an open issue in lambeq's repo. We've been working towards creating a set of issues that will be available for external contributors and appropriate for beginners. Thanks!
My apologies, may I open it in Discussions?
Yes, please open it as a feature request (in the Ideas category), not as a task available to external contributors.
Note that if you plan to open this as a PR by yourself, you just need to submit the PR and we will review it. No need to open a corresponding issue.
Description
The choice of feature map is perhaps the most important choice a QML engineer can make, as it directly dictates the size of the required circuit. There are four common encoding approaches: basis encoding, angle encoding, amplitude encoding, and IQP encoding.
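To make the per-feature qubit cost concrete, here is a small NumPy sketch of angle encoding (illustrative only; `angle_encode` is a hypothetical helper, not part of lambeq). Each feature is mapped to a single-qubit RY rotation, so N features always consume N qubits:

```python
import numpy as np

def angle_encode(features):
    """Angle-encode a feature vector: one qubit per feature.

    Each feature x_i is mapped to the single-qubit state
    RY(x_i)|0> = cos(x_i/2)|0> + sin(x_i/2)|1>, and the register is
    the tensor product of these states, so N features need N qubits.
    """
    state = np.array([1.0])
    for x in features:
        qubit = np.array([np.cos(x / 2), np.sin(x / 2)])
        state = np.kron(state, qubit)  # tensor product adds one qubit
    return state

features = [0.1, 0.7, 1.3, 2.0]
state = angle_encode(features)
print(len(features), "features ->", int(np.log2(state.size)), "qubits")
# prints: 4 features -> 4 qubits
```

The statevector has 2^N entries but, crucially, the qubit count grows linearly with the number of features, which is the scaling the issue below contrasts with amplitude encoding.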
There is a wide array of literature on each of these approaches, and each has its own family of algorithms. As a rule of thumb, however, amplitude encoders are better suited to large models, since the number of qubits they require scales logarithmically with the number of features to be embedded. Currently, lambeq provides `IQPAnsatz`, based on IQP encoding, which, like angle encoding, requires N qubits to encode N features. (For now, we will ignore depth scaling.)

The challenge is to create a new ansatz class, `AmplitudeAnsatz`, based on amplitude encoding, motivated by the efficiency of amplitude encoders for high-dimensional data.
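As a rough illustration of what such an ansatz would have to do (a minimal sketch under my own assumptions, not lambeq's actual API), amplitude encoding pads the feature vector to the next power of two and L2-normalises it; the result is the statevector to be prepared, using only ceil(log2(N)) qubits for N features:

```python
import numpy as np

def amplitude_encode(features):
    """Sketch of amplitude-encoding preprocessing (hypothetical helper).

    Pads the feature vector to the next power of two and L2-normalises
    it; the result is the target statevector an amplitude encoder must
    prepare, using ceil(log2(N)) qubits for N features.
    """
    v = np.asarray(features, dtype=float)
    n_qubits = max(1, int(np.ceil(np.log2(v.size))))
    padded = np.zeros(2 ** n_qubits)
    padded[: v.size] = v
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the all-zero vector")
    return padded / norm, n_qubits

state, n_qubits = amplitude_encode([3.0, 4.0, 0.0, 0.0, 12.0])
print(n_qubits)  # 5 features fit in 3 qubits (8 amplitudes)
print(round(float(state @ state), 6))  # statevector is normalised: 1.0
```

The actual circuit that prepares this state (e.g. via Möttönen-style decompositions) is the hard part and is left to the implementation; the point here is only the logarithmic qubit scaling that motivates the feature request.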
Notes

- `AtomicType` (…)
- The `Model` is trained given the embedded circuits (going back to what `Model` is exactly).

See Also
- https://cqcl.github.io/lambeq/tutorials/extend-lambeq.html#Creating-ans%C3%A4tze
- https://arxiv.org/pdf/1804.11326.pdf
- https://docs.pennylane.ai/en/stable/code/api/pennylane.IQPEmbedding.html
- https://qiskit.org/documentation/stubs/qiskit.circuit.library.IQP.html
- https://qiskit.org/ecosystem/machine-learning/tutorials/01_neural_networks.html