modelscope / FunASR

A Fundamental End-to-End Speech Recognition Toolkit and Open Source SOTA Pretrained Models, Supporting Speech Recognition, Voice Activity Detection, Text Post-processing etc.
https://www.funasr.com

How can FunASR export a pretrained model to ONNX on Ubuntu 18.04.6 LTS? #1899

Open LP-world2002 opened 3 months ago

LP-world2002 commented 3 months ago

Notice: In order to resolve issues more efficiently, please raise issues following the template.

❓ Questions and Help

How can FunASR export a pretrained model to ONNX on Ubuntu 18.04.6 LTS?

Before asking:

  1. search the issues.
  2. search the docs.

What is your question?

I used the Python method from the FunASR documentation for exporting ONNX models to try to export the pretrained model paraformer-zh-streaming to ONNX, but I keep getting errors!

Code

(funasr_env) lipeng@lipeng:~/share/modules$ vim export_ONNX_1.py
(funasr_env) lipeng@lipeng:~/share/modules$ cat export_ONNX_1.py 
from funasr import AutoModel

model = AutoModel(model="paraformer-zh-streaming", device="cpu")

res = model.export(quantize=False)
(funasr_env) lipeng@lipeng:~/share/modules$ python export_ONNX_1.py 

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.0.0 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/lipeng/share/modules/export_ONNX_1.py", line 1, in <module>
    from funasr import AutoModel
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 37, in <module>
    import_submodules(__name__)
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 33, in import_submodules
    results.update(import_submodules(name))
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 27, in import_submodules
    results[name] = importlib.import_module(name)
  File "/home/lipeng/.pyenv/versions/3.12.0/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/lipeng/share/modules/FunASR/funasr/bin/train.py", line 20, in <module>
    from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/__init__.py", line 1, in <module>
    from ._flat_param import FlatParameter as FlatParameter
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_flat_param.py", line 30, in <module>
    from torch.distributed.fsdp._common_utils import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_common_utils.py", line 35, in <module>
    from torch.distributed.fsdp._fsdp_extensions import FSDPExtensions
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_fsdp_extensions.py", line 8, in <module>
    from torch.distributed._tensor import DeviceMesh, DTensor
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/__init__.py", line 6, in <module>
    import torch.distributed._tensor.ops
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/ops/__init__.py", line 2, in <module>
    from .embedding_ops import *  # noqa: F403
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/ops/embedding_ops.py", line 8, in <module>
    import torch.distributed._functional_collectives as funcol
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_functional_collectives.py", line 12, in <module>
    from . import _functional_collectives_impl as fun_col_impl
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_functional_collectives_impl.py", line 36, in <module>
    from torch._dynamo import assume_constant_result
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/__init__.py", line 2, in <module>
    from . import convert_frame, eval_frame, resume_execution
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/convert_frame.py", line 40, in <module>
    from . import config, exc, trace_rules
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/trace_rules.py", line 50, in <module>
    from .variables import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/variables/__init__.py", line 34, in <module>
    from .higher_order_ops import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/variables/higher_order_ops.py", line 13, in <module>
    import torch.onnx.operators
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/__init__.py", line 59, in <module>
    from ._internal.onnxruntime import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/_internal/onnxruntime.py", line 37, in <module>
    import onnxruntime  # type: ignore[import]
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/onnxruntime/__init__.py", line 23, in <module>
    from onnxruntime.capi._pybind_state import ExecutionMode  # noqa: F401
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/onnxruntime/capi/_pybind_state.py", line 32, in <module>
    from .onnxruntime_pybind11_state import *  # noqa
AttributeError: _ARRAY_API not found
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
You are using the latest version of funasr-1.0.28
2024-06-27 10:19:44,289 - modelscope - INFO - PyTorch version 2.3.1 Found.
2024-06-27 10:19:44,290 - modelscope - INFO - Loading ast index from /home/lipeng/.cache/modelscope/ast_indexer
2024-06-27 10:19:44,326 - modelscope - INFO - Loading done! Current index file version is 1.15.0, with md5 175badd7bee3549728a3e474785a2bcf and a total number of 980 components indexed
transformer is not installed, please install it if you want to use related modules
2024-06-27 10:19:44,716 - modelscope - WARNING - Using the master branch is fragile, please use it with caution!
2024-06-27 10:19:44,716 - modelscope - INFO - Use user-specified model revision: master
Traceback (most recent call last):
  File "/home/lipeng/share/modules/export_ONNX_1.py", line 5, in <module>
    res = model.export(quantize=False)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lipeng/share/modules/FunASR/funasr/auto/auto_model.py", line 613, in export
    export_dir = export_utils.export(model=model, data_in=data_list, **kwargs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lipeng/share/modules/FunASR/funasr/utils/export_utils.py", line 18, in export
    _onnx(
  File "/home/lipeng/share/modules/FunASR/funasr/utils/export_utils.py", line 57, in _onnx
    export_name = model.export_name + ".onnx"
                  ~~~~~~~~~~~~~~~~~~^~~~~~~~~
TypeError: unsupported operand type(s) for +: 'method' and 'str'
(funasr_env) lipeng@lipeng:~/share/modules$ 
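For reference, the failing line shown in the traceback (funasr/utils/export_utils.py: export_name = model.export_name + ".onnx") concatenates model.export_name with a string, but on this streaming model export_name appears to be a bound method rather than a string. A minimal, purely hypothetical local workaround sketch (not an official fix) would call it when it is callable:

# Hypothetical patch sketch for the line in funasr/utils/export_utils.py shown in the
# traceback above; it assumes export_name may be either a plain string or a bound method.
name = model.export_name() if callable(model.export_name) else model.export_name
export_name = name + ".onnx"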

What have you tried?

I tried a different script format, but the error still occurs!

(funasr_env) lipeng@lipeng:~/share/modules$ cat export_ONNX.py 
#!/usr/bin/env python3
# -*- encoding: utf-8 -*-
# Copyright FunASR (https://github.com/alibaba-damo-academy/FunASR). All Rights Reserved.
#  MIT License  (https://opensource.org/licenses/MIT)

# Import AutoModel from funasr
from funasr import AutoModel

# Method 1: Inference from model hub
model_hub = AutoModel(model="iic/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-online")

# Export the model as ONNX
res_hub = model_hub.export(type="onnx", quantize=False)
print("Model exported from model hub:", res_hub)

# Method 2: Inference from local path
local_model_path = "/home/lipeng/share/modules/output_ONNX"

model_local = AutoModel(model=local_model_path)

# Export the model as ONNX
res_local = model_local.export(type="onnx", quantize=False)
print("Model exported from local path:", res_local)

(funasr_env) lipeng@lipeng:~/share/modules$ python export_ONNX.py 

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.0.0 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/lipeng/share/modules/export_ONNX.py", line 7, in <module>
    from funasr import AutoModel
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 37, in <module>
    import_submodules(__name__)
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 33, in import_submodules
    results.update(import_submodules(name))
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 27, in import_submodules
    results[name] = importlib.import_module(name)
  File "/home/lipeng/.pyenv/versions/3.12.0/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/lipeng/share/modules/FunASR/funasr/bin/train.py", line 20, in <module>
    from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/__init__.py", line 1, in <module>
    from ._flat_param import FlatParameter as FlatParameter
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_flat_param.py", line 30, in <module>
    from torch.distributed.fsdp._common_utils import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_common_utils.py", line 35, in <module>
    from torch.distributed.fsdp._fsdp_extensions import FSDPExtensions
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_fsdp_extensions.py", line 8, in <module>
    from torch.distributed._tensor import DeviceMesh, DTensor
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/__init__.py", line 6, in <module>
    import torch.distributed._tensor.ops
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/ops/__init__.py", line 2, in <module>
    from .embedding_ops import *  # noqa: F403
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/ops/embedding_ops.py", line 8, in <module>
    import torch.distributed._functional_collectives as funcol
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_functional_collectives.py", line 12, in <module>
    from . import _functional_collectives_impl as fun_col_impl
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_functional_collectives_impl.py", line 36, in <module>
    from torch._dynamo import assume_constant_result
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/__init__.py", line 2, in <module>
    from . import convert_frame, eval_frame, resume_execution
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/convert_frame.py", line 40, in <module>
    from . import config, exc, trace_rules
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/trace_rules.py", line 50, in <module>
    from .variables import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/variables/__init__.py", line 34, in <module>
    from .higher_order_ops import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/variables/higher_order_ops.py", line 13, in <module>
    import torch.onnx.operators
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/__init__.py", line 59, in <module>
    from ._internal.onnxruntime import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/_internal/onnxruntime.py", line 37, in <module>
    import onnxruntime  # type: ignore[import]
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/onnxruntime/__init__.py", line 23, in <module>
    from onnxruntime.capi._pybind_state import ExecutionMode  # noqa: F401
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/onnxruntime/capi/_pybind_state.py", line 32, in <module>
    from .onnxruntime_pybind11_state import *  # noqa
AttributeError: _ARRAY_API not found
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
You are using the latest version of funasr-1.0.28
2024-06-27 10:22:01,591 - modelscope - INFO - PyTorch version 2.3.1 Found.
2024-06-27 10:22:01,591 - modelscope - INFO - Loading ast index from /home/lipeng/.cache/modelscope/ast_indexer
2024-06-27 10:22:01,627 - modelscope - INFO - Loading done! Current index file version is 1.15.0, with md5 175badd7bee3549728a3e474785a2bcf and a total number of 980 components indexed
transformer is not installed, please install it if you want to use related modules
2024-06-27 10:22:02,055 - modelscope - WARNING - Using the master branch is fragile, please use it with caution!
2024-06-27 10:22:02,055 - modelscope - INFO - Use user-specified model revision: master
Traceback (most recent call last):
  File "/home/lipeng/share/modules/export_ONNX.py", line 13, in <module>
    res_hub = model_hub.export(type="onnx", quantize=False)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lipeng/share/modules/FunASR/funasr/auto/auto_model.py", line 613, in export
    export_dir = export_utils.export(model=model, data_in=data_list, **kwargs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lipeng/share/modules/FunASR/funasr/utils/export_utils.py", line 18, in export
    _onnx(
  File "/home/lipeng/share/modules/FunASR/funasr/utils/export_utils.py", line 57, in _onnx
    export_name = model.export_name + ".onnx"
                  ~~~~~~~~~~~~~~~~~~^~~~~~~~~
TypeError: unsupported operand type(s) for +: 'method' and 'str'
(funasr_env) lipeng@lipeng:~/share/modules$ 

To verify whether the environment itself is the problem, I also tried exporting ONNX with the default Python example from the documentation. No errors are reported, so the environment is fine!

(funasr_env) lipeng@lipeng:~/share/modules$ vim export_ONNX_2.py
(funasr_env) lipeng@lipeng:~/share/modules$ cat export_ONNX_2.py 
from funasr import AutoModel

model = AutoModel(model="paraformer", device="cpu")

res = model.export(quantize=False)
(funasr_env) lipeng@lipeng:~/share/modules$ python export_ONNX_2.py 

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.0.0 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/lipeng/share/modules/export_ONNX_2.py", line 1, in <module>
    from funasr import AutoModel
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 37, in <module>
    import_submodules(__name__)
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 33, in import_submodules
    results.update(import_submodules(name))
  File "/home/lipeng/share/modules/FunASR/funasr/__init__.py", line 27, in import_submodules
    results[name] = importlib.import_module(name)
  File "/home/lipeng/.pyenv/versions/3.12.0/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/lipeng/share/modules/FunASR/funasr/bin/train.py", line 20, in <module>
    from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/__init__.py", line 1, in <module>
    from ._flat_param import FlatParameter as FlatParameter
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_flat_param.py", line 30, in <module>
    from torch.distributed.fsdp._common_utils import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_common_utils.py", line 35, in <module>
    from torch.distributed.fsdp._fsdp_extensions import FSDPExtensions
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/fsdp/_fsdp_extensions.py", line 8, in <module>
    from torch.distributed._tensor import DeviceMesh, DTensor
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/__init__.py", line 6, in <module>
    import torch.distributed._tensor.ops
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/ops/__init__.py", line 2, in <module>
    from .embedding_ops import *  # noqa: F403
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_tensor/ops/embedding_ops.py", line 8, in <module>
    import torch.distributed._functional_collectives as funcol
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_functional_collectives.py", line 12, in <module>
    from . import _functional_collectives_impl as fun_col_impl
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/distributed/_functional_collectives_impl.py", line 36, in <module>
    from torch._dynamo import assume_constant_result
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/__init__.py", line 2, in <module>
    from . import convert_frame, eval_frame, resume_execution
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/convert_frame.py", line 40, in <module>
    from . import config, exc, trace_rules
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/trace_rules.py", line 50, in <module>
    from .variables import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/variables/__init__.py", line 34, in <module>
    from .higher_order_ops import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/_dynamo/variables/higher_order_ops.py", line 13, in <module>
    import torch.onnx.operators
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/__init__.py", line 59, in <module>
    from ._internal.onnxruntime import (
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/_internal/onnxruntime.py", line 37, in <module>
    import onnxruntime  # type: ignore[import]
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/onnxruntime/__init__.py", line 23, in <module>
    from onnxruntime.capi._pybind_state import ExecutionMode  # noqa: F401
  File "/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/onnxruntime/capi/_pybind_state.py", line 32, in <module>
    from .onnxruntime_pybind11_state import *  # noqa
AttributeError: _ARRAY_API not found
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
<unknown>:815: SyntaxWarning: invalid escape sequence '\w'
<unknown>:1371: SyntaxWarning: invalid escape sequence '\w'
You are using the latest version of funasr-1.0.28
2024-06-27 10:24:24,649 - modelscope - INFO - PyTorch version 2.3.1 Found.
2024-06-27 10:24:24,649 - modelscope - INFO - Loading ast index from /home/lipeng/.cache/modelscope/ast_indexer
2024-06-27 10:24:24,692 - modelscope - INFO - Loading done! Current index file version is 1.15.0, with md5 175badd7bee3549728a3e474785a2bcf and a total number of 980 components indexed
transformer is not installed, please install it if you want to use related modules
2024-06-27 10:24:25,105 - modelscope - WARNING - Using the master branch is fragile, please use it with caution!
2024-06-27 10:24:25,105 - modelscope - INFO - Use user-specified model revision: master
/home/lipeng/share/modules/FunASR/funasr/models/transformer/embedding.py:395: TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
  log_timescale_increment = torch.log(torch.tensor([10000], dtype=dtype, device=device)) / (
/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/_internal/jit_utils.py:307: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ../torch/csrc/jit/passes/onnx/constant_fold.cpp:179.)
  _C._jit_pass_onnx_node_shape_type_inference(node, params_dict, opset_version)
/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/symbolic_opset10.py:531: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
  return g.op("Constant", value_t=torch.tensor(list_or_value))
/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/utils.py:702: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ../torch/csrc/jit/passes/onnx/constant_fold.cpp:179.)
  _C._jit_pass_onnx_graph_shape_type_inference(
/home/lipeng/share/modules/FunASR/funasr_env/lib/python3.12/site-packages/torch/onnx/utils.py:1208: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ../torch/csrc/jit/passes/onnx/constant_fold.cpp:179.)
  _C._jit_pass_onnx_graph_shape_type_inference(
output dir: /home/lipeng/.cache/modelscope/hub/iic/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-pytorch
(funasr_env) lipeng@lipeng:~/share/modules$ 

What's your environment?

- OS (e.g., Linux): Ubuntu 18.04.6 LTS
- FunASR Version (e.g., 1.0.0): 1.0.28
- ModelScope Version (e.g., 1.11.0): 1.15.0
- PyTorch Version (e.g., 2.0.0): 2.3.1+cu121
- How you installed funasr (pip, source): funasr 1.0.28, /home/lipeng/share/modules/FunASR
- Python version: 3.12.0
- GPU (e.g., V100M32):
- CUDA/cuDNN version (e.g., cuda11.7):
- Docker version (e.g., funasr-runtime-sdk-cpu-0.4.1): Docker version 24.0.2, build cb74dfc
- Any other relevant information:

I am running Ubuntu 18.04.6 LTS on a VMware virtual machine, and all I want is to export the pretrained model paraformer-zh-streaming to ONNX, but no matter what I do, I get an error!

LauraGPT commented 2 months ago

numpy==1.29
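A minimal way to apply that suggestion inside the same virtual environment, assuming the intent is simply "use a NumPy 1.x release" (the AttributeError: _ARRAY_API not found comes from onnxruntime having been compiled against NumPy 1.x and then imported under NumPy 2.0):

# Downgrade NumPy below 2.0 so onnxruntime, built against NumPy 1.x, can import cleanly.
pip install "numpy<2"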