onnxruntime / Whisper-HybridLoop-Onnx-Demo

MIT License

Error Node (BeamSearch_node) has input size 12 not in range [min=5, max=10] #2

Open sergiosolorzano opened 11 months ago

sergiosolorzano commented 11 months ago

I have tried with both Python 3.10 and 3.11 to create an ONNX whisper-tiny model. I create a conda env for this and follow the instructions in Microsoft's Olive repo.

For example, I first clone the Olive repo and switch to a tagged release (I tried all of the tags), then create a conda env for Python 3.11 and install Olive:

```
cd Olive
git checkout tags/v0.2.0
python -m pip install .
```

Then, in `examples/whisper`:

```
python -m pip install -r requirements.txt
python -m pip uninstall -y onnxruntime ort-nightly
python -m pip install ort-nightly --index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/
```
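After the uninstall/reinstall dance above it is worth confirming which onnxruntime build the environment actually resolves to; a small sanity check using the standard `onnxruntime` module attributes (`__version__`, `get_available_providers()`):

```python
# Confirm which onnxruntime build this interpreter actually imports.
try:
    import onnxruntime as ort
    print("onnxruntime version:", ort.__version__)
    print("available providers:", ort.get_available_providers())
except ImportError:
    print("onnxruntime is not importable in this environment")
```

If the version printed is not the nightly you just installed, a stale `onnxruntime` package is still shadowing it.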

This throws an error when I build with Python 3.11:

```
(env_olive311) sergio@Ubuntu-2204-oai:~/PythonWorkspace/Olive/examples/whisper$ python prepare_whisper_configs.py --model_name openai/whisper-tiny.en
Traceback (most recent call last):
  File "/home/sergio/PythonWorkspace/Olive/examples/whisper/prepare_whisper_configs.py", line 231, in <module>
    main()
  File "/home/sergio/PythonWorkspace/Olive/examples/whisper/prepare_whisper_configs.py", line 39, in main
    whisper_model = get_ort_whisper_for_conditional_generation(args.model_name)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sergio/anaconda3/envs/env_olive311/lib/python3.11/site-packages/olive/hf_utils.py", line 59, in get_ort_whisper_for_conditional_generation
    decoder = WhisperDecoder(model, None, model.config)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: WhisperDecoder.__init__() takes 3 positional arguments but 4 were given
```
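The `TypeError` itself is a plain signature mismatch: Olive's `hf_utils.py` passes three positional arguments to `WhisperDecoder`, but the `WhisperDecoder` class shipped with the installed runtime apparently accepts only two (plus `self`). A stand-alone sketch of that failure mode, where the class body and signatures are assumptions for illustration, not the actual library code:

```python
# Hypothetical stand-in for the library's WhisperDecoder: two positional
# parameters besides self, so Python counts "3 positional arguments"
# including self in the error message.
class WhisperDecoder:
    def __init__(self, decoder, config):
        self.decoder = decoder
        self.config = config

model, config = object(), object()

try:
    # The Olive call site passes an extra positional argument:
    WhisperDecoder(model, None, config)
except TypeError as e:
    print(e)  # ...takes 3 positional arguments but 4 were given
```

This is why pinning a matching pair of Olive and onnxruntime versions matters: the call site in one package has to agree with the class definition in the other.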


If I instead manage to build the ONNX model with Python 3.10 using the same process, I get the error below when loading the model in Microsoft's demo https://github.com/onnxruntime/Whisper-HybridLoop-Onnx-Demo/tree/main/AudioNoteTranscription

```
OnnxRuntimeException: [ErrorCode:InvalidGraph] Load model from C:/AR-VR-Github/UnitySentisStableDiffusion-And-Whisper/Assets/StreamingAssets/whisper/model.onnx failed:
This is an invalid model. In Node, ("BeamSearch_node", BeamSearch, "com.microsoft", -1) :
("log_mel": tensor(float), "max_length": tensor(int32), "min_length": tensor(int32), "num_beams": tensor(int32),
 "num_return_sequences": tensor(int32), "length_penalty": tensor(float), "repetition_penalty": tensor(float),
 "", "", "", "", "",) -> ("sequences",),
Error Node (BeamSearch_node) has input size 12 not in range [min=5, max=10].
Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess (System.IntPtr nativeStatus) (at <36441e0316944e7eb9fd86bf4a9a5a82>:0)
Microsoft.ML.OnnxRuntime.InferenceSession.Init (System.String modelPath, Microsoft.ML.OnnxRuntime.SessionOptions options, Microsoft.ML.OnnxRuntime.PrePackedWeightsContainer prepackedWeightsContainer) (at <36441e0316944e7eb9fd86bf4a9a5a82>:0)
Microsoft.ML.OnnxRuntime.InferenceSession..ctor (System.String modelPath, Microsoft.ML.OnnxRuntime.SessionOptions options) (at <36441e0316944e7eb9fd86bf4a9a5a82>:0)
```

Since this involves both repos, I have also posted this at https://github.com/microsoft/Olive/issues/477

DimQ1 commented 7 months ago

Resolved by using Microsoft.ML.OnnxRuntime 1.16.3 and removing Microsoft.ML.OnnxRuntime.Azure from the app.
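In project-file terms, the fix above amounts to pinning the runtime package and dropping the Azure extension package. A sketch of the relevant `.csproj` fragment, assuming a standard SDK-style project (the package names come from the comment above; the surrounding project layout is an assumption):

```xml
<ItemGroup>
  <!-- Pin the runtime to 1.16.3, whose BeamSearch schema accepts this model. -->
  <PackageReference Include="Microsoft.ML.OnnxRuntime" Version="1.16.3" />
  <!-- Any <PackageReference Include="Microsoft.ML.OnnxRuntime.Azure" ...>
       line should be deleted, per the resolution above. -->
</ItemGroup>
```

Mixing the Azure extension package with a plain runtime package can pull in mismatched native binaries, which is consistent with the loader rejecting the model's newer BeamSearch node.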