Closed · ll971 closed this issue 2 years ago
When I was learning about TensorFlow Quantum at https://tensorflow.google.cn/quantum/tutorials/qcnn, the output of the quantum convolution layer is measured in the Z basis. I want to know how the hybrid quantum convolutional neural network updates its parameters. What gradient calculation methods are used? And why can a classical optimizer such as Adam update the parameters in the quantum part?

There are a number of methods for calculating gradients. I believe that tutorial uses the adjoint method, but another common method is the parameter-shift rule. There is a whole section on gradients in the white paper (https://arxiv.org/pdf/2003.02989.pdf, page 17) if you want to know more. If you are more interested in the programmatic side, there is also: https://tensorflow.google.cn/quantum/tutorials/gradients
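For intuition about the parameter-shift rule: it evaluates the same circuit at two shifted parameter values and combines the two expectation values into an exact gradient, so nothing has to backpropagate through the quantum state itself. Here is a minimal sketch in plain Cirq (not the tutorial's code); the one-qubit `rx` circuit and the `expectation_z` helper are illustrative choices:

```python
import numpy as np
import cirq

qubit = cirq.GridQubit(0, 0)
simulator = cirq.Simulator()

def expectation_z(theta):
    """<Z> after rx(theta) on |0>; analytically this is cos(theta)."""
    circuit = cirq.Circuit(cirq.rx(theta)(qubit))
    state = simulator.simulate(circuit).final_state_vector
    pauli_z = cirq.PauliString(cirq.Z(qubit))
    return pauli_z.expectation_from_state_vector(
        state, qubit_map={qubit: 0}).real

# Parameter-shift rule for a single-qubit rotation:
#   d<Z>/dtheta = (<Z>(theta + pi/2) - <Z>(theta - pi/2)) / 2
theta = 0.3
shift = np.pi / 2
grad = (expectation_z(theta + shift) - expectation_z(theta - shift)) / 2
print(grad, -np.sin(theta))  # both approximately -sin(theta)
```

The key point is that each gradient entry only requires running the circuit twice more, which is why the rule also works on real hardware where the state vector is not accessible.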
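As for why Adam can update the quantum parameters: once a differentiator reports d(expectation)/d(parameter) back to TensorFlow, the circuit symbols look like ordinary trainable weights, and any Keras optimizer can apply its usual update rule to them. Below is a minimal sketch using `tfq.layers.PQC` with a toy one-qubit circuit (an assumption for illustration, not the QCNN model from the tutorial):

```python
import cirq
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

# Toy model circuit with one trainable rotation angle.
qubit = cirq.GridQubit(0, 0)
theta = sympy.Symbol('theta')
model_circuit = cirq.Circuit(cirq.rx(theta)(qubit))

# PQC measures <Z>; the differentiator argument selects the gradient method
# (tfq.differentiators.Adjoint() is another option).
pqc = tfq.layers.PQC(model_circuit, cirq.Z(qubit),
                     differentiator=tfq.differentiators.ParameterShift())

model = tf.keras.Sequential([pqc])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.1),
              loss='mse')

# Drive <Z> toward -1; Adam updates theta like any classical weight.
empty_input = tfq.convert_to_tensor([cirq.Circuit()])
target = tf.constant([[-1.0]])
model.fit(empty_input, target, epochs=50, verbose=0)
print(model.trainable_variables)  # the learned value of theta
```

In a hybrid model like the QCNN, the quantum layer's gradients (from the chosen differentiator) and the classical layers' gradients (from ordinary autodiff) simply flow through the same `tf.GradientTape`, so one optimizer updates everything.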