jaybinks opened 1 year ago
The same issue happens with medium and small models.
In case it helps:
pip list
Package Version
------------------ ---------
ane-transformers 0.1.1
certifi 2023.5.7
charset-normalizer 3.1.0
coremltools 6.3.0
ffmpeg-python 0.2.0
filelock 3.12.1
fsspec 2023.6.0
future 0.18.3
huggingface-hub 0.15.1
idna 3.4
Jinja2 3.1.2
llvmlite 0.40.1rc1
MarkupSafe 2.1.3
more-itertools 9.1.0
mpmath 1.3.0
networkx 3.1
numba 0.57.0
numpy 1.24.3
openai-whisper 20230314
packaging 23.1
pip 23.1.2
protobuf 3.20.1
PyYAML 6.0
regex 2023.6.3
requests 2.31.0
safetensors 0.3.1
setuptools 67.6.1
sympy 1.12
tiktoken 0.3.1
tokenizers 0.13.3
torch 2.0.0
tqdm 4.65.0
transformers 4.30.1
typing_extensions 4.6.3
urllib3 2.0.3
wheel 0.40.0
brew list --versions
aom 3.6.1
brotli 1.0.9
ca-certificates 2023-05-30
cairo 1.16.0_5
chromaprint 1.5.1_1
cjson 1.7.15
coreutils 9.3
dav1d 1.2.1
ffmpeg@4 4.4.4
fftw 3.3.10_1
flac 1.4.2
fontconfig 2.14.2
freetype 2.13.0_1
frei0r 1.8.0
fribidi 1.0.13
gcc 13.1.0
gettext 0.21.1
giflib 5.2.1
git 2.41.0
glib 2.76.3
gmp 6.2.1_1
gnutls 3.8.0
graphite2 1.3.14
harfbuzz 7.3.0
highway 1.0.4
htop 3.2.2
hwloc 2.9.1
icu4c 72.1
imath 3.1.9
isl 0.26
jpeg-turbo 2.1.5.1
jpeg-xl 0.8.1_3
lame 3.100
leptonica 1.82.0_2
libarchive 3.6.2_1
libass 0.17.1
libb2 0.98.1
libbluray 1.3.4
libevent 2.1.12
libidn2 2.3.4_1
libmpc 1.3.1
libnghttp2 1.54.0
libogg 1.3.5
libpng 1.6.39
librist 0.2.7_3
libsamplerate 0.2.2
libsndfile 1.2.0_1
libsodium 1.0.18_1
libsoxr 0.1.3
libtasn1 4.19.0
libtiff 4.5.0
libunibreak 5.1
libunistring 1.1
libvidstab 1.1.1
libvmaf 2.3.1
libvorbis 1.3.7
libvpx 1.13.0
libx11 1.8.4
libxau 1.0.11
libxcb 1.15_1
libxdmcp 1.1.4
libxext 1.3.5
libxrender 0.9.11
little-cms2 2.15
lz4 1.9.4
lzo 2.10
mad 0.15.1b
mbedtls 3.4.0
mpdecimal 2.5.1
mpfr 4.2.0-p9
mpg123 1.31.3
ncurses 6.4
nettle 3.9.1
open-mpi 4.1.5
opencore-amr 0.1.6
openexr 3.1.8_1
openjpeg 2.5.0_1
openssl@1.1 1.1.1u
openssl@3 3.1.1
opus 1.4
opusfile 0.12
p11-kit 0.24.1_1
pango 1.50.14
pcre2 10.42
pixman 0.42.2
pkg-config 0.29.2_3
python@3.11 3.11.4
rav1e 0.6.6
readline 8.2.1
rubberband 3.2.1
sdl2 2.26.5
snappy 1.1.10
sox 14.4.2_5
speex 1.2.1
sqlite 3.42.0
srt 1.5.1
tesseract 5.3.1
theora 1.1.1
unbound 1.17.1
webp 1.3.0_1
wget 1.21.4
x264 r3095
x265 3.5
xorgproto 2022.2
xvid 1.3.7
xz 5.4.3
zeromq 4.3.4
zimg 3.0.4
zstd 1.5.5
miniconda py310_23.3.1-0
More info about my environment:
jaybinks@Jays-Mac-mini whisper.cpp % conda info
active environment : None
user config file : /Users/jaybinks/.condarc
populated config files :
conda version : 23.3.1
conda-build version : not installed
python version : 3.10.10.final.0
virtual packages : __archspec=1=arm64
__osx=13.2.1=0
__unix=0=0
base environment : /opt/homebrew/Caskroom/miniconda/base (writable)
conda av data dir : /opt/homebrew/Caskroom/miniconda/base/etc/conda
conda av metadata url : None
channel URLs : https://repo.anaconda.com/pkgs/main/osx-arm64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/osx-arm64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /opt/homebrew/Caskroom/miniconda/base/pkgs
/Users/jaybinks/.conda/pkgs
envs directories : /opt/homebrew/Caskroom/miniconda/base/envs
/Users/jaybinks/.conda/envs
platform : osx-arm64
user-agent : conda/23.3.1 requests/2.28.1 CPython/3.10.10 Darwin/22.3.0 OSX/13.2.1
UID:GID : 501:20
netrc file : None
offline mode : False
jaybinks@Jays-Mac-mini whisper.cpp % pip freeze
ane-transformers==0.1.1
certifi==2023.5.7
charset-normalizer==3.1.0
coremltools==6.3.0
ffmpeg-python==0.2.0
filelock==3.12.1
fsspec==2023.6.0
future==0.18.3
huggingface-hub==0.15.1
idna==3.4
Jinja2==3.1.2
llvmlite==0.40.1rc1
MarkupSafe==2.1.3
more-itertools==9.1.0
mpmath==1.3.0
networkx==3.1
numba==0.57.0
numpy==1.24.3
openai-whisper==20230314
packaging==23.1
protobuf==3.20.1
PyYAML==6.0
regex==2023.6.3
requests==2.31.0
safetensors==0.3.1
sympy==1.12
tiktoken==0.3.1
tokenizers==0.13.3
torch==2.0.0
tqdm==4.65.0
transformers==4.30.1
typing_extensions==4.6.3
urllib3==2.0.3
jaybinks@Jays-Mac-mini whisper.cpp %
I managed to get around this by downloading the CoreML models from the URL in https://github.com/ggerganov/whisper.cpp/pull/566.
However, it would still be good to know whether this is a problem on my end or whether the instructions need to be refined further.
@jaybinks I believe the error comes from using Python 3.11, which is currently not compatible with the coremltools master branch. https://github.com/apple/coremltools/issues/1730#issuecomment-1382615962
The solution is to either update to coremltools 7.0b1 or downgrade to Python 3.10. I tried coremltools 7.0b1 and it converted successfully.
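For reference, a rough sketch of the two options; the package version and environment name below are just examples drawn from this thread, not an official recommendation:
# Option A: keep Python 3.11 and use the coremltools 7.0 beta
pip install coremltools==7.0b1
# Option B: use a Python 3.10 environment instead
conda create -n whisper-coreml python=3.10
conda activate whisper-coreml
pip install openai-whisper ane_transformers coremltools
./models/generate-coreml-model.sh base.en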
I have to second this. I ran into a similar problem on Python 3.10.11, having followed the setup instructions to a T. The problem occurred when I tried to build the CoreML version of medium.en:
whisper.cpp % ./models/generate-coreml-model.sh medium.en
Torch version 2.0.1 has not been tested with coremltools. You may run into unexpected errors. Torch 2.0.0 is the most recent version that has been tested.
Traceback (most recent call last):
File "<snip>/whisper.cpp/models/convert-whisper-to-coreml.py", line 10, in <module>
from ane_transformers.reference.layer_norm import LayerNormANE as LayerNormANEBase
ModuleNotFoundError: No module named 'ane_transformers.reference'
coremlc: error: Model does not exist at models/coreml-encoder-medium.en.mlpackage -- file:///<snip>/whisper.cpp/
mv: rename models/coreml-encoder-medium.en.mlmodelc to models/ggml-medium.en-encoder.mlmodelc: No such file or directory
Any insight would be appreciated.
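Since the traceback points at a missing ane_transformers.reference module, one hedged thing to check is whether ane_transformers is visible to the interpreter the script actually runs with. The import below is the same one the script attempts:
python3 -c "from ane_transformers.reference.layer_norm import LayerNormANE; print('ane_transformers OK')"
pip show ane-transformers
If that import fails, reinstalling the package into that same interpreter (pip install ane_transformers) would be the plausible next step.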
I tried to find the CoreML model download link in https://github.com/ggerganov/whisper.cpp/pull/566 as suggested by @jaybinks, but I seem to get lost in that long PR thread so I would appreciate a more precise pointer to where I can find it...
I had the same error when using Python 3.11. After switching to a venv with Python 3.10, the issue went away.
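A minimal sketch of that venv approach, assuming python3.10 is already installed and using the packages from the pip lists earlier in this thread (the venv name is arbitrary):
python3.10 -m venv venv-coreml
source venv-coreml/bin/activate
pip install openai-whisper ane_transformers coremltools
./models/generate-coreml-model.sh base.en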
This works for me on Python 3.11:
pip uninstall coremltools
pip install coremltools==7.0b2
./models/generate-coreml-model.sh base.en
Based on the above discussion this should work, but it does not. This is my conda environment on an Apple M1 Pro:
(python310) kpg-mcb:whisper.cpp kpg$ conda list
# packages in environment at /Users/kpg/miniconda3/envs/python310:
#
# Name Version Build Channel
ane-transformers 0.1.3 pypi_0 pypi
attrs 23.1.0 pypi_0 pypi
bzip2 1.0.8 h620ffc9_4
ca-certificates 2023.08.22 hca03da5_0
cattrs 23.1.2 pypi_0 pypi
certifi 2023.7.22 pypi_0 pypi
charset-normalizer 3.3.0 pypi_0 pypi
coremltools 7.0 pypi_0 pypi
exceptiongroup 1.1.3 pypi_0 pypi
filelock 3.12.4 pypi_0 pypi
fsspec 2023.9.2 pypi_0 pypi
huggingface-hub 0.17.3 pypi_0 pypi
idna 3.4 pypi_0 pypi
libffi 3.4.4 hca03da5_0
llvmlite 0.41.0 pypi_0 pypi
more-itertools 10.1.0 pypi_0 pypi
mpmath 1.3.0 pypi_0 pypi
ncurses 6.4 h313beb8_0
numba 0.58.0 pypi_0 pypi
numpy 1.25.2 pypi_0 pypi
openai-whisper 20230918 pypi_0 pypi
openssl 1.1.1w h1a28f6b_0
packaging 23.2 pypi_0 pypi
pip 23.2.1 py310hca03da5_0
protobuf 3.20.1 pypi_0 pypi
pyaml 23.9.7 pypi_0 pypi
python 3.10.0 hbdb9e5c_5
pyyaml 6.0.1 pypi_0 pypi
readline 8.2 h1a28f6b_0
regex 2023.10.3 pypi_0 pypi
requests 2.31.0 pypi_0 pypi
safetensors 0.4.0 pypi_0 pypi
setuptools 68.0.0 py310hca03da5_0
sqlite 3.41.2 h80987f9_0
sympy 1.12 pypi_0 pypi
tiktoken 0.3.3 pypi_0 pypi
tk 8.6.12 hb8d0fd4_0
tokenizers 0.14.1 pypi_0 pypi
torch 1.11.0 pypi_0 pypi
tqdm 4.66.1 pypi_0 pypi
transformers 4.34.0 pypi_0 pypi
typing-extensions 4.8.0 pypi_0 pypi
tzdata 2023c h04d1e81_0
urllib3 2.0.6 pypi_0 pypi
wheel 0.41.2 py310hca03da5_0
xz 5.4.2 h80987f9_0
zlib 1.2.13 h5a0b063_0
Error:
Converting PyTorch Frontend ==> MIL Ops: 100%|███████████████████████████████████████████████████████████████████████████████▊| 533/534 [00:00<00:00, 5391.68 ops/s]
Running MIL frontend_pytorch pipeline: 100%|████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:00<00:00, 576.57 passes/s]
Running MIL default pipeline: 100%|████████████████████████████████████████████████████████████████████████████████████████████| 57/57 [00:00<00:00, 81.39 passes/s]
Running MIL backend_mlprogram pipeline: 100%|████████████████████████████████████████████████████████████████████████████████| 10/10 [00:00<00:00, 2119.62 passes/s]
Traceback (most recent call last):
File "/Users/kpg/python/whisper.cpp/models/convert-whisper-to-coreml.py", line 323, in <module>
encoder = convert_encoder(hparams, encoder, quantize=args.quantize)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/kpg/python/whisper.cpp/models/convert-whisper-to-coreml.py", line 259, in convert_encoder
model = ct.convert(
^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/_converters_entry.py", line 492, in convert
mlmodel = mil_convert(
^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 188, in mil_convert
return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 212, in _mil_convert
proto, mil_program = mil_convert_to_proto(
^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 303, in mil_convert_to_proto
out = backend_converter(prog, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 130, in __call__
return backend_load(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/backend/mil/load.py", line 283, in load
raise RuntimeError("BlobWriter not loaded")
RuntimeError: BlobWriter not loaded
coremlc: error: Model does not exist at models/coreml-encoder-base.en.mlpackage -- file:///Users/kpg/python/whisper.cpp/
mv: rename models/coreml-encoder-base.en.mlmodelc to models/ggml-base.en-encoder.mlmodelc: No such file or directory
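Worth noting: the traceback above loads coremltools from /opt/homebrew/lib/python3.11/site-packages even though the conda environment listed is Python 3.10, so the conversion script may be running under Homebrew's Python 3.11 rather than the activated env. A quick diagnostic sketch to confirm which interpreter and coremltools are actually being picked up:
which python3
python3 -c "import sys, coremltools; print(sys.executable); print(coremltools.__version__, coremltools.__file__)"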
The direct download for the CoreML models does not work anymore. Is there another location to download them?
Running pip install numpy==1.26.4 worked for me.
On a nearly brand-new Mac mini M2, I get the following error when trying to generate the CoreML files.
Any ideas or suggestions would be appreciated.