Closed: MatteoRobbiati closed this PR 2 months ago
I just uploaded some further results: some long trainings of the 10-qubit, 10-layer model to explore the learning rate value. The results can be summarized with this plot: lr_hyperopt.pdf.
Important: I am moving all the files here: https://mega.nz/folder/tewlwBzI#0lW4fvTiaFD1KvSXivsn3A.
I am closing this PR, since we are saving all the data on MEGA.
VQE training data
In the following table I collect the currently uploaded data obtained by training VQEs. The table reports some hyper-parameters, but much more information can be extracted from the
optimization_results.json
file located in each folder.

Summary table
In the following, when a list is reported it means that the target training setup has been repeated for each element of the list; e.g. BFGS was used on 8 qubits with 4, 5 and 6 layers. You can find all the results in the dedicated folders.
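Each run folder contains an optimization_results.json that can be inspected with the standard library. The sketch below is a minimal, hypothetical example: the field names (`optimizer`, `nqubits`, etc.) are assumptions and should be adjusted after inspecting a real file.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical content: the actual keys inside optimization_results.json
# may differ; check one of the unzipped run folders.
sample = {
    "optimizer": "BFGS",
    "nqubits": 8,
    "nlayers": 4,
    "parameters": [0.1, 0.2, 0.3],
}

# Write a stand-in file so the example is self-contained.
folder = Path(tempfile.mkdtemp())
results_file = folder / "optimization_results.json"
results_file.write_text(json.dumps(sample))

# Loading mirrors what you would do with a real training folder.
with open(results_file) as f:
    results = json.load(f)

print(results["optimizer"], results["nqubits"], results["nlayers"])
```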
The data are organized into zipped files with names like:
{Optimizer}_{nqubits}q_{nlayers}l_{learning_rate}lr_{random_seed}s
It should be quite easy to understand which training each file refers to.
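The naming scheme above can be parsed programmatically when iterating over many runs. Here is a small sketch (the helper `parse_run_name` is mine, not part of the repository):

```python
import re

# Archive names follow the pattern
# {Optimizer}_{nqubits}q_{nlayers}l_{learning_rate}lr_{random_seed}s
PATTERN = re.compile(
    r"(?P<optimizer>[^_]+)_"
    r"(?P<nqubits>\d+)q_"
    r"(?P<nlayers>\d+)l_"
    r"(?P<learning_rate>[\d.eE+-]+)lr_"
    r"(?P<seed>\d+)s"
)

def parse_run_name(name: str) -> dict:
    """Extract the hyper-parameters encoded in a zipped-run name."""
    match = PATTERN.fullmatch(name.removesuffix(".zip"))
    if match is None:
        raise ValueError(f"Unrecognized run name: {name}")
    info = match.groupdict()
    info["nqubits"] = int(info["nqubits"])
    info["nlayers"] = int(info["nlayers"])
    info["learning_rate"] = float(info["learning_rate"])
    info["seed"] = int(info["seed"])
    return info

print(parse_run_name("Adam_10q_10l_0.05lr_42s.zip"))
```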
Instructions to load the architectures and play with DBQA
First, you have to download the results to your computer and unzip the files. You can update the branch by switching to it and pulling.
Then you have to unzip them, either manually or with the command
unzip your_file_to_be_unzipped.zip
You can now use the script you find in the
extras
folder. Between lines 19 and 24 of this script you can set the filepath from which you want to load results, the training status you want to load (namely the parameters obtained at a certain point of the training) and the number of DBI steps. On top of this, please proceed with the task @marekgluza mentioned in the meeting.
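As a rough sketch, the settings around lines 19-24 of that script might look like the following; all variable names here are hypothetical placeholders, since the actual script in the extras folder defines its own:

```python
from pathlib import Path

# Hypothetical settings, analogous to lines 19-24 of the extras script;
# adjust names and values to match the real file.
results_path = Path("Adam_10q_10l_0.05lr_42s")  # folder of an unzipped run
training_status = 500   # training iteration whose parameters are loaded
dbi_steps = 2           # number of DBI (double-bracket iteration) steps

print(results_path, training_status, dbi_steps)
```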