Can you please share more details?
@adekusar-drl As above: I installed the library module, then tried to run an example from the README, but got the same error as above. That is what I meant.
Thanks for noting this. This is not an error, but rather a warning in this case. We use the COBYLA optimizer in this example, which is gradient free, so the example still works.
@adekusar-drl Sorry for bothering you, but the COBYLA optimizer still shows the same warning for me ("Cannot compute gradient operator! Continuing without gradients!"). Any idea how this may be fixed?
Note: I have the latest version of Qiskit.
@moustafa7zada In general, it is just a warning, not an error. That means the code that emits this warning may still work correctly. Even if you use COBYLA you may see this warning. Since COBYLA is a gradient-free optimizer, it does not require gradient evaluation, so the gradient is simply not needed. Without seeing the code I can't say more about this issue.
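To see this in isolation, here is a minimal sketch (assuming a Qiskit Terra version that provides Optimizer.minimize) showing COBYLA optimizing from function values alone, with no gradient callable passed:

from qiskit.algorithms.optimizers import COBYLA

# COBYLA only ever evaluates the objective; it never asks for a
# gradient, so no jac argument is needed.
opt = COBYLA(maxiter=100)
result = opt.minimize(fun=lambda x: (x[0] - 1.0) ** 2, x0=[0.0])
print(result.x, result.fun)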
@adekusar-drl Thanks a lot for your help :) ... the code looks like this:
from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector
from qiskit.algorithms.optimizers import COBYLA
from qiskit_machine_learning.algorithms import VQC
from qiskit_machine_learning.circuit.library import RawFeatureVector

num_qubits = 2
depth = 4
theta = ParameterVector('θ', length=num_qubits)  # creating a list of trainable parameters

custom_circ = QuantumCircuit(num_qubits)
for _ in range(depth):
    # single-qubit RX rotations, sharing the same parameters across layers
    for i in range(num_qubits):
        custom_circ.rx(theta[i], i)
    # all-to-all CX entangling block
    for j in range(num_qubits):
        for k in range(j):
            custom_circ.cx(k, j)
    custom_circ.barrier()

qc = RawFeatureVector(4)  # amplitude-encodes 4 features into 2 qubits
my_opt = COBYLA(maxiter=500, tol=0.001)
vqc = VQC(optimizer=my_opt, feature_map=qc, ansatz=custom_circ)
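For reference, a hypothetical end-to-end check on made-up toy data (the dataset below is illustrative only; VQC expects one-hot encoded labels, and depending on your version may also need a backend):

import numpy as np
from qiskit import BasicAer
from qiskit.utils import QuantumInstance

# Four unit basis vectors as inputs: RawFeatureVector(4) amplitude-encodes
# 4 features into 2 qubits. Labels are one-hot encoded for two classes.
X = np.eye(4)
y = np.array([[1, 0], [0, 1], [1, 0], [0, 1]])

vqc = VQC(optimizer=my_opt, feature_map=qc, ansatz=custom_circ,
          quantum_instance=QuantumInstance(BasicAer.get_backend("statevector_simulator")))
vqc.fit(X, y)           # the gradient warning may still be printed here
print(vqc.score(X, y))  # training proceeds despite the warning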
@moustafa7zada The reason is RawFeatureVector: it does not support gradient evaluation. The model can still be trained using gradient-free optimizers.
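If you do want a gradient-based optimizer, one possible workaround (my own assumption, not advice from this thread) is to replace RawFeatureVector with an ordinary parameterized feature map such as ZZFeatureMap, which the gradient framework can differentiate:

from qiskit.circuit.library import ZZFeatureMap
from qiskit.algorithms.optimizers import ADAM

# Note: the input now has 2 features (one per qubit) instead of the
# 4 amplitudes that RawFeatureVector(4) encoded.
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)
vqc_grad = VQC(optimizer=ADAM(maxiter=100), feature_map=feature_map, ansatz=custom_circ)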
@adekusar-drl Okay, I will go with COBYLA ... thank you for your time. Just one last question: is COBYLA the only gradient-free optimizer in Qiskit?
Explore the package: https://qiskit.org/documentation/stubs/qiskit.algorithms.optimizers.html. For instance, SPSA is also gradient free; it estimates the gradient without direct invocations of the gradient framework. If it is unclear from the documentation, you can query gradient_support_level:
opt = COBYLA()
opt.gradient_support_level
Out[13]: <OptimizerSupportLevel.ignored: 1>
opt = ADAM()
opt.gradient_support_level
Out[16]: <OptimizerSupportLevel.supported: 2>
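To survey several optimizers at once, a small loop over the classes works (assuming all of these are available in your Qiskit version):

from qiskit.algorithms.optimizers import ADAM, COBYLA, NELDER_MEAD, SPSA

# Gradient-free optimizers report "ignored"; gradient-based ones "supported".
for cls in (COBYLA, NELDER_MEAD, SPSA, ADAM):
    print(cls.__name__, cls().gradient_support_level)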
Information
What is the current behavior?
Running the first machine learning example from the Qiskit Machine Learning README fails with: "Cannot compute gradient operator! Continuing without gradients!"
Steps to reproduce the problem
What is the expected behavior?
The example runs without issues.