qiskit-advocate / qamp-spring-23

Qiskit advocate mentorship program (QAMP) Spring 23 cohort (April - July 2023)

Train a quantum encoding of classical data as a density matrix with a good low-rank approximation #14

Open adekusar-drl opened 1 year ago

adekusar-drl commented 1 year ago

Description

Data representation as a density matrix of some quantum state is widely used in quantum machine learning (QML) models. In many cases, the main task is to extract the eigenvalues and eigenvectors of those matrices. This question is particularly interesting when only a few of the eigenvalues are large, i.e., when the matrix admits a good low-rank approximation. However, the question of representing data as a density matrix with certain properties can be studied in more detail.

In the paper “Covariance Matrix Preparation for Quantum Principal Component Analysis”, covariance and density matrices corresponding to amplitude-encoded data were studied. A robust method of representing real data as a density matrix suitable for QML methods could make some QML models more efficient.

This project aims to train a quantum encoding of classical data as a density matrix with a good low-rank approximation. We aim to use the quantum purity $P = \mathrm{Tr}(\rho^2) = \mathrm{Tr}(\mathrm{SWAP}\,(\rho \otimes \rho))$. To simplify training, we plan to train in batches. The model will be evaluated on toy datasets such as ad hoc data, Iris, and possibly others.
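As a rough illustration of how the purity is read off a SWAP test (a hedged sketch assuming a recent Qiskit with `qiskit.quantum_info`; the single-qubit `ry` encoding is a placeholder, not the actual feature map):

```python
# Hedged sketch: estimate the purity Tr(rho^2) of an encoded state via a
# SWAP test, evaluated exactly with Statevector (no simulator needed).
# The ry "feature map" below is illustrative, not the project's encoding.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

theta = 0.7                      # illustrative data-encoding angle
prep = QuantumCircuit(1)
prep.ry(theta, 0)                # toy single-qubit "feature map"

qc = QuantumCircuit(3)
qc.compose(prep, [1], inplace=True)   # first copy of the state
qc.compose(prep, [2], inplace=True)   # second copy
qc.h(0)
qc.cswap(0, 1, 2)                # controlled-SWAP between the two copies
qc.h(0)

# SWAP test: P(ancilla = 0) = (1 + Tr(rho^2)) / 2, so purity = 2 P(0) - 1
probs = Statevector(qc).probabilities([0])
purity = 2 * probs[0] - 1
print(purity)                    # ~1.0 here, since the encoded state is pure
```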

This is a continuation of the work started in the previous term by https://github.com/VasilyBokov

Deliverables

We expect to have a new trainable feature map suitable for QML algorithms.

Mentor details

Number of mentees

1

Type of mentees

VasilyBokov commented 1 year ago

I am planning to carry out this project as a continuation of my previous QAMP project on quantum kernel methods in machine learning models.

Siheon-Park commented 1 year ago

Hello Anton, I am interested in your project. I have actually been researching quantum kernel-based machine learning methods, which are essentially PCA. I was gathering momentum for a follow-up paper when I found this issue. I believe we can discover something useful in this project.

adekusar-drl commented 1 year ago

@Siheon-Park this is a continuation of previously started work, so unfortunately it is unlikely to be assigned to somebody else. Thanks for your interest.

VasilyBokov commented 1 year ago

So far, we have developed an approach for training quantum kernels in machine learning tasks. Traditionally, pairwise calculations have been required to evaluate the distance between density matrices, which poses computational challenges for large datasets. We propose a cost function based on the Hilbert-Schmidt distance, which allows us to avoid pairwise calculations and estimate distances solely through SWAP tests.
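Concretely, the squared Hilbert-Schmidt distance decomposes as

$$D_{HS}^2(\rho_A, \rho_B) = \mathrm{Tr}(\rho_A^2) + \mathrm{Tr}(\rho_B^2) - 2\,\mathrm{Tr}(\rho_A \rho_B),$$

and each trace reduces to a SWAP-test expectation via $\mathrm{Tr}(\rho\sigma) = \mathrm{Tr}(\mathrm{SWAP}\,(\rho \otimes \sigma))$. A minimal NumPy check of this decomposition (illustrative only, not our actual implementation):

```python
# Minimal NumPy check that the Hilbert-Schmidt distance splits into
# SWAP-test-estimable traces: D_HS^2 = Tr(rho^2) + Tr(sigma^2) - 2 Tr(rho sigma),
# using the identity Tr(X Y) = Tr(SWAP (X ⊗ Y)).
import numpy as np

rng = np.random.default_rng(0)

def random_density_matrix(dim):
    A = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    rho = A @ A.conj().T
    return rho / np.trace(rho)

def swap_operator(dim):
    # SWAP on a bipartite system: SWAP |i>|j> = |j>|i>
    S = np.zeros((dim**2, dim**2))
    for i in range(dim):
        for j in range(dim):
            S[j * dim + i, i * dim + j] = 1.0
    return S

rho, sigma = random_density_matrix(2), random_density_matrix(2)
S = swap_operator(2)

# Each trace term evaluated through the SWAP identity
t_rr = np.trace(S @ np.kron(rho, rho)).real       # Tr(rho^2)
t_ss = np.trace(S @ np.kron(sigma, sigma)).real   # Tr(sigma^2)
t_rs = np.trace(S @ np.kron(rho, sigma)).real     # Tr(rho sigma)
d2_swap = t_rr + t_ss - 2 * t_rs

d2_direct = np.linalg.norm(rho - sigma, 'fro') ** 2
assert np.isclose(d2_swap, d2_direct)
print(d2_swap, d2_direct)
```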

The algorithm begins by dividing the dataset into two equal parts representing the two classes. For each class, we construct a density matrix proportional to the class kernel. By associating the kernels with density matrices, we can measure the distance between them using the Hilbert-Schmidt distance, and we demonstrate how each term in the distance expression can be estimated using SWAP tests. Below you can find pictures of the two kernels before and after the training, respectively.
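As a rough NumPy illustration of this construction (dimensions and data are placeholders, not our experiments): each class density matrix is the average of the projectors onto the amplitude-encoded samples, $\rho_C = \frac{1}{|C|}\sum_{x \in C} |x\rangle\langle x|$, whose Gram matrix is the class kernel.

```python
# Hedged sketch: class-wise density matrices from amplitude-encoded samples.
import numpy as np

rng = np.random.default_rng(1)

def amplitude_encode(X):
    # L2-normalize each sample so its entries form valid amplitudes
    return X / np.linalg.norm(X, axis=1, keepdims=True)

def class_density_matrix(X):
    # rho_C = (1/|C|) * sum over class samples of |x><x|
    psi = amplitude_encode(X)
    return (psi[:, :, None] * psi[:, None, :]).mean(axis=0)

X_a = rng.standard_normal((8, 4))         # toy class-A samples, 4 features
X_b = rng.standard_normal((8, 4)) + 2.0   # toy class-B samples, shifted
rho_a = class_density_matrix(X_a)
rho_b = class_density_matrix(X_b)

# Squared Hilbert-Schmidt distance between the two class states
diff = rho_a - rho_b
print(np.trace(diff @ diff))
```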

We employ this distance as a cost function during training to optimize the encoding parameters of the unitary operator. By minimizing the Hilbert-Schmidt cost, we aim to find the encoding that best separates the classes. Our approach eliminates the need for time-consuming pairwise calculations and offers a computationally efficient alternative based on SWAP tests, although it requires a large number of multi-controlled gates. To make the circuit shallower while still avoiding pairwise calculations, we now aim to implement batch-based training.
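To sketch what we mean by batch-based training (a toy classical stand-in: the one-parameter angle encoding and finite-difference gradient are placeholders, and the cost is taken here as the negative Hilbert-Schmidt distance between the class states so that descent pushes them apart; the actual circuit, cost convention, and optimizer may differ):

```python
# Hedged sketch of batch-based training: per step, draw a mini-batch from
# each class, build the batch density matrices, and update the encoding
# parameter. On hardware one would use a parameterized circuit with, e.g.,
# the parameter-shift rule instead of finite differences.
import numpy as np

rng = np.random.default_rng(2)

def encode(x, theta):
    # Toy angle encoding of a scalar feature: |x> = [cos(theta x), sin(theta x)]
    return np.stack([np.cos(theta * x), np.sin(theta * x)], axis=1)

def class_rho(x, theta):
    psi = encode(x, theta)
    return (psi[:, :, None] * psi[:, None, :]).mean(axis=0)

def cost(theta, xa, xb):
    d = class_rho(xa, theta) - class_rho(xb, theta)
    return -np.trace(d @ d)              # negative HS distance

xa = rng.normal(0.0, 0.3, size=200)      # toy class-A feature values
xb = rng.normal(1.5, 0.3, size=200)      # toy class-B feature values

theta, lr, batch, eps = 0.5, 0.2, 32, 1e-3
for _ in range(200):
    ia, ib = rng.choice(len(xa), batch), rng.choice(len(xb), batch)
    # Central finite-difference gradient on the mini-batch cost
    g = (cost(theta + eps, xa[ia], xb[ib])
         - cost(theta - eps, xa[ia], xb[ib])) / (2 * eps)
    theta -= lr * g

print(theta, -cost(theta, xa, xb))       # trained angle and final separation
```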

Overall, our work contributes to the advancement of quantum embeddings in machine learning.

[Screenshot: kernel before training (2023-06-12)]

[Screenshot: kernel after training (2023-06-12)]