qiskit-community / qiskit-machine-learning

Quantum Machine Learning
https://qiskit-community.github.io/qiskit-machine-learning/
Apache License 2.0

Can't compute gradient operator #101

Closed FrankFeenix closed 3 years ago

FrankFeenix commented 3 years ago

Information

What is the current behavior?

Trying the example from the first Machine Learning Programming Experiment in Qiskit fails with: Cannot compute gradient operator! Continuing without gradients!

Steps to reproduce the problem

What is the expected behavior?

Run without issue

adekusar-drl commented 3 years ago

Can you please share more details?

FrankFeenix commented 3 years ago

@adekusar-drl As I said above, I installed the library and then tried to run the example from the README, but got the error above. That is all I meant.

adekusar-drl commented 3 years ago

Thanks for noting this. This is not an error, but rather a warning in this case. The example uses the COBYLA optimizer, which is gradient-free, so the example still works correctly.

moustafa7zada commented 2 years ago

@adekusar-drl Sorry for bothering you, but the COBYLA optimizer still shows the same warning for me (Cannot compute gradient operator! Continuing without gradients). Any idea how this may be fixed? Note: I have the latest version of Qiskit.

adekusar-drl commented 2 years ago

@moustafa7zada In general, it is just a warning, not an error, meaning the code may still work correctly despite it. You may see this warning even when using COBYLA. Since COBYLA is a gradient-free optimizer, it does not require gradient evaluation, so the missing gradient does not affect training. Without seeing the code I can't say more about this issue.
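
To illustrate the point (a sketch, not from the thread): a gradient-free optimizer such as COBYLA only ever evaluates the objective function and never asks for a gradient. Assuming a recent Qiskit version that provides the minimize interface:

from qiskit.algorithms.optimizers import COBYLA

# COBYLA minimizes using only function evaluations; no jac/gradient is passed.
opt = COBYLA(maxiter=100)
result = opt.minimize(fun=lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2, x0=[0.0, 0.0])
print(result.x, result.fun)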

moustafa7zada commented 2 years ago

@adekusar-drl Thanks a lot for your help :) The code looks like this:

from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector
from qiskit.algorithms.optimizers import COBYLA
from qiskit_machine_learning.algorithms import VQC
from qiskit_machine_learning.circuit.library import RawFeatureVector

num_qubits = 2
depth = 4
theta = ParameterVector('θ', length=num_qubits)  # creating a list of Parameters
custom_circ = QuantumCircuit(num_qubits)

# Ansatz: a layer of RX rotations (sharing the same parameters across layers)
# followed by a CX entangling block, repeated `depth` times.
for _ in range(depth):
    for i in range(num_qubits):
        custom_circ.rx(theta[i], i)
    for j in range(num_qubits):
        for k in range(j):
            custom_circ.cx(k, j)
    custom_circ.barrier()

qc = RawFeatureVector(4)  # amplitude encoding of a 4-dimensional feature vector
my_opt = COBYLA(maxiter=500, tol=0.001)
vqc = VQC(optimizer=my_opt, feature_map=qc, ansatz=custom_circ)
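
For context, a minimal training call to go with the snippet above (a sketch: the random samples and one-hot labels are placeholders, not part of the original report, and data handling details can vary between versions):

import numpy as np

# Hypothetical toy data: 4 features per sample to match RawFeatureVector(4), 2 classes.
X = np.random.rand(8, 4)
y = np.eye(2)[np.random.randint(2, size=8)]  # one-hot labels

vqc.fit(X, y)  # training proceeds with COBYLA despite the gradient warning
print(vqc.score(X, y))
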
adekusar-drl commented 2 years ago

@moustafa7zada The reason is RawFeatureVector: gradients cannot be computed through it, so the model can only be trained with gradient-free optimizers.
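
If gradient-based training is wanted, one option (an addition for illustration, not from the thread) is to swap RawFeatureVector for a differentiable feature map such as ZZFeatureMap from qiskit.circuit.library:

from qiskit.circuit.library import ZZFeatureMap
from qiskit.algorithms.optimizers import ADAM

# ZZFeatureMap is built from parameterized standard gates, so gradients can be computed.
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)
vqc_grad = VQC(optimizer=ADAM(maxiter=100), feature_map=feature_map, ansatz=custom_circ)

Note the input dimension changes: ZZFeatureMap encodes one feature per qubit, while RawFeatureVector(4) packs four features into the amplitudes of two qubits.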

moustafa7zada commented 2 years ago

@adekusar-drl Okay, I will go with COBYLA. Thank you for your time. Just one last question: is COBYLA the only gradient-free optimizer in Qiskit?

adekusar-drl commented 2 years ago

Explore the optimizers package: https://qiskit.org/documentation/stubs/qiskit.algorithms.optimizers.html For instance, SPSA is also gradient-free: it estimates the gradient itself, without direct invocations of the gradient framework. If it is unclear from the documentation, you can query gradient_support_level:

from qiskit.algorithms.optimizers import ADAM, COBYLA

opt = COBYLA()
opt.gradient_support_level
# Out: <OptimizerSupportLevel.ignored: 1>

opt = ADAM()
opt.gradient_support_level
# Out: <OptimizerSupportLevel.supported: 2>
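
To survey several optimizers at once (a sketch; the list below is illustrative, and any optimizer from qiskit.algorithms.optimizers can be checked the same way):

from qiskit.algorithms.optimizers import ADAM, COBYLA, NELDER_MEAD, POWELL, SPSA

# Print each optimizer's gradient support level; "ignored" means gradient-free.
for cls in (ADAM, COBYLA, NELDER_MEAD, POWELL, SPSA):
    print(cls.__name__, cls().gradient_support_level)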