qiskit-community / qiskit-aqua

Quantum Algorithms & Applications (**DEPRECATED** since April 2021 - see readme for more info)
https://qiskit.org/aqua
Apache License 2.0

initializing AQGD optimizer #768

Closed kareem1925 closed 4 years ago

kareem1925 commented 4 years ago

Information

An error occurs when I try to initialize the AQGD optimizer:

```
SchemaError: True is not of type 'number'

Failed validating 'type' in metaschema['properties']['properties']['additionalProperties']['properties']['exclusiveMaximum']: {'type': 'number'}

On schema['properties']['momentum']['exclusiveMaximum']: True
```

Steps to reproduce the problem

```python
from qiskit.aqua.components import optimizers
opt = optimizers.AQGD()
```

What is the expected behavior?

I expect nothing to go wrong, just as with `opt = optimizers.ADAM(maxiter=20, lr=0.2)`, which executes without any errors.

Suggested solutions

I believe something goes wrong while validating this optimizer's configuration file. Modifying it may help; I'm still looking into it.

woodsp-ibm commented 4 years ago

At some point the JSON Schema specification changed its definition of exclusiveMaximum, and we corrected our code to match the latest schema spec in 0.6.2. Try installing the latest released version of Aqua, which is 0.6.2: pip install -U qiskit-aqua. This was fixed in that version.
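
For context, a minimal illustration of that spec change (made-up schema fragments, not Aqua's actual configuration file): draft-04 treated exclusiveMaximum as a boolean flag modifying "maximum", while draft-06 and later define it as a number, which is why the old True value fails validation against newer metaschemas.

```python
# Hypothetical schema fragments illustrating the JSON Schema change;
# the property name "momentum" mirrors the error message above.

# Draft-04 style: exclusiveMaximum is a boolean flag alongside "maximum".
momentum_schema_draft4 = {
    "type": "number",
    "maximum": 1,
    "exclusiveMaximum": True,  # newer metaschemas reject this: "True is not of type 'number'"
}

# Draft-06+ style: exclusiveMaximum is itself the (strict) upper bound.
momentum_schema_draft6 = {
    "type": "number",
    "exclusiveMaximum": 1,  # momentum must be strictly less than 1
}
```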

kareem1925 commented 4 years ago

Thanks a lot. I had to upgrade the whole qiskit package to make it work: sudo -H pip install qiskit --upgrade. Thanks again! :+1:

kareem1925 commented 4 years ago

I'm sorry for closing the issue without permission. Actually, there is something else, @woodsp-ibm: in this line I believe you should omit the "not", because the while loop produces an error; when I changed it, everything worked perfectly. Another thing: the typical momentum value in neural networks is 0.9, but here it is set to 0.25 and then subtracted from 1. That confused me and may confuse other people, so I hope you consider changing it to the default value used by other deep learning libraries. I'd be glad to help with that.

woodsp-ibm commented 4 years ago

I had expected you to close this issue, since you created it and a fix was given that worked - you are permitted to do this, as well as re-open, so no problem :)

The issue now arises from this line https://github.com/Qiskit/qiskit-aqua/blob/00ed44162b9cae91ef38098c0ed8bb1ee0c8e9fc/qiskit/aqua/components/optimizers/aqgd.py#L67, which was not in the original code but was added because of a lint check failure. The converged method check https://github.com/Qiskit/qiskit-aqua/blob/00ed44162b9cae91ef38098c0ed8bb1ee0c8e9fc/qiskit/aqua/components/optimizers/aqgd.py#L132 needed that to create the instance variable if it did not exist - the first time through that call, in the old code, it would have been correct. Since the variable is now created up front in the current code - which is maybe the better option - the test should now be if self._previous_loss is None: rather than using the hasattr logic.
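
A minimal sketch of the two styles of check being contrasted (names follow the thread; the convergence bookkeeping is simplified, not the exact aqgd.py logic):

```python
class AQGDSketch:
    """Hypothetical stand-in for the AQGD optimizer's convergence check."""

    def __init__(self):
        # The lint fix creates the attribute unconditionally here, so a
        # hasattr() guard inside converged() can never fire again.
        self._previous_loss = None

    def converged(self, objval, tol=1e-6):
        # Old style: guard on attribute existence - broken once __init__
        # always creates the attribute:
        #     if not hasattr(self, '_previous_loss'): ...
        # New style: test the sentinel value instead.
        if self._previous_loss is None:
            self._previous_loss = objval
            return False
        done = abs(objval - self._previous_loss) < tol
        self._previous_loss = objval
        return done
```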

As to momentum, it is used on this line, where both the value and 1 minus the value appear. Is this the line you are questioning? https://github.com/Qiskit/qiskit-aqua/blob/00ed44162b9cae91ef38098c0ed8bb1ee0c8e9fc/qiskit/aqua/components/optimizers/aqgd.py#L114
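
For readers following along, the update on that line blends the previous step and the fresh gradient with complementary weights, roughly like this (a paraphrase of the idea, not the verbatim aqgd.py code):

```python
momentum = 0.25                # Aqua's default, the value under discussion
deriv = [0.1, -0.3]            # stand-in for the current gradient estimate
previous_update = [0.0, 0.0]   # running momentum buffer

# Convex combination: momentum weights the history, (1 - momentum) weights
# the fresh gradient. A small default like 0.25 therefore emphasizes the
# new gradient, the opposite emphasis of the usual momentum=0.9 convention.
update = [momentum * p + (1 - momentum) * d
          for p, d in zip(previous_update, deriv)]
```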

kareem1925 commented 4 years ago

Hi, thanks for your reply.

> needed that to create the instance variable if it did not exist - the first time through that call, in the old code, it would have been correct. Since the variable is now created up front in the current code - which is maybe the better option - the test should now be if self._previous_loss is None: rather than using the hasattr logic.

Should I change it and make a pull request? It's working now on my machine, and I also tested it on a Windows machine and on Colab.

> As to momentum, it is used on this line, where both the value and 1 minus the value appear. Is this the line you are questioning? https://github.com/Qiskit/qiskit-aqua/blob/00ed44162b9cae91ef38098c0ed8bb1ee0c8e9fc/qiskit/aqua/components/optimizers/aqgd.py#L114

Yeah, that's the line. I'm accustomed to 0.9, as it's the default value in the TensorFlow and PyTorch libraries. When I was trying to explain quantum gradients to my colleagues, it actually stopped me for a while :sweat_smile: so I had to implement the whole algorithm from scratch to keep it consistent with the flow of other ML libraries. It's only a suggestion.

woodsp-ibm commented 4 years ago

Hi, sure, having a PR to fix the if self._previous_loss test would be great.

I recall the defaults were chosen based on some testing that was done - mostly on the analytic gradient of the RY variational form when used in VQE for a ground state energy computation, where these values seemed to do well. I was looking at PyTorch SGD and it defaults to 0 - were you looking at some other code? https://github.com/pytorch/pytorch/blob/927c2a02b0b29a0fafcced8d65896dd417023067/torch/optim/sgd.py#L51 It is only a default, after all, and you are of course free to use 0.9 rather than 0.25.
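
Since momentum is an ordinary constructor parameter (the keyword name as it appears in the schema from the original report), overriding the default is a one-liner - a usage sketch:

```python
from qiskit.aqua.components import optimizers

# Override the 0.25 default with the common deep-learning convention.
opt = optimizers.AQGD(momentum=0.9)
```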

kareem1925 commented 4 years ago

Of course, the default would be zero for pure SGD, as you mentioned, but with momentum most people use 0.9, as in this simple example from Keras (the first code snippet). I was not aware of the RY thing; I'll check it some other time. And I'll work on the PR right away :+1: thanks for the go-ahead!

woodsp-ibm commented 4 years ago

Closing as this was fixed by #770

kareem1925 commented 4 years ago

Hi @woodsp-ibm, I'm now having the same issue because of the while loop :cry: on this line. I thought PR #770 should have solved the problem, and it actually did solve it locally on my device.

woodsp-ibm commented 4 years ago

Hi @kareem1925, the PR was for the master branch. It has not yet been released, but it will be part of the upcoming 0.7.0, currently planned for March. In master the line has changed https://github.com/Qiskit/qiskit-aqua/blob/769ca8f7fbb91fcfb4dd47c956b6358bb53212ef/qiskit/aqua/components/optimizers/aqgd.py#L141 so installing qiskit-aqua from a source clone of the repo should work. If you installed a stable release, 0.6.4 or earlier, it will still behave as you reported; only master was changed by the PR.
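
One way to pick up the master branch fix ahead of the 0.7.0 release (standard pip usage, not a command given in the thread):

```sh
# Install qiskit-aqua directly from the master branch on GitHub.
pip install git+https://github.com/Qiskit/qiskit-aqua.git
```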