Closed larenzhang closed 1 year ago
Hi, we do not currently plan to support RNNs. However, we do plan to support transformer models, which are also popular in the NLP field.
This issue has not received any updates in 120 days. Please reply to this issue if it is still unresolved!
This is nice work benchmarking quantization methods across various CNN architectures. Recurrent neural networks are another mainstream architecture, widely used for time-series models on edge devices. Quantization can also be applied to recurrent networks such as LSTMs and GRUs. I am curious whether there is any plan to benchmark the quantization of recurrent neural networks.
Looking forward to your response. Best wishes!
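As an illustration of what such an RNN benchmark could cover, here is a minimal sketch (not part of this project; it assumes PyTorch is available and uses a toy `TinyLSTM` model defined for the example) of post-training dynamic quantization applied to an LSTM via `torch.quantization.quantize_dynamic`, which converts LSTM and Linear weights to int8:

```python
import torch
import torch.nn as nn

# Toy model for illustration only: a single-layer LSTM followed by a classifier head.
class TinyLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
        self.fc = nn.Linear(32, 4)

    def forward(self, x):
        out, _ = self.lstm(x)
        # Use the last time step's hidden state for classification.
        return self.fc(out[:, -1, :])

model = TinyLSTM().eval()

# Dynamic quantization: weights are stored as int8, activations are
# quantized on the fly at inference time.
qmodel = torch.quantization.quantize_dynamic(
    model, {nn.LSTM, nn.Linear}, dtype=torch.qint8
)

x = torch.randn(2, 10, 16)  # (batch, seq_len, features)
y = qmodel(x)
print(y.shape)  # torch.Size([2, 4])
```

Dynamic quantization is typically the first approach tried for RNNs because the recurrent weight matrices dominate model size, while per-step activation statistics are hard to calibrate statically.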