CreateTheImaginable opened 2 years ago
Got the same error for several other models too, @hollance (julien-c/hotdog-not-hotdog, savasy/bert-base-turkish-sentiment-cased).
My pip list is the following:
Package Version Editable project location
------------------ --------- -----------------------------------
certifi 2022.6.15
charset-normalizer 2.1.1
coremltools 5.2.0
exporters 0.0.1 /Users/gibbon/Desktop/exporters/src
filelock 3.8.0
huggingface-hub 0.9.1
idna 3.3
mpmath 1.2.1
numpy 1.23.2
packaging 21.3
Pillow 9.2.0
pip 22.1.1
protobuf 3.20.0
pyparsing 3.0.9
PyYAML 6.0
regex 2022.8.17
requests 2.28.1
setuptools 62.3.2
sympy 1.11.1
tokenizers 0.12.1
torch 1.12.1
tqdm 4.64.1
transformers 4.21.3
typing_extensions 4.3.0
urllib3 1.26.12
Oops, actually, downgrading torch to 1.11.0 (as mentioned in a warning from coremltools) did the trick. Does it fix your issue too @CreateTheImaginable?
@julien-c downgrading to 1.11.0 using "pip install torch==1.11.0" worked like a charm! Believe it or not, I was on the bleeding edge, using one of the PyTorch nightly builds (1.13.0.dev20220822) to take maximum advantage of Apple's M1 Max GPU via Metal, per the WWDC22 session "Accelerate machine learning with Metal". I did not think to downgrade to a more stable version.
Often you can ignore these version warnings but I guess this time they weren't kidding. :-) Glad to hear you got it solved.
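The version check being hinted at can be sketched in a few lines. This is just an illustration, not coremltools' actual warning code; it assumes coremltools 5.2 was tested against torch up to 1.11.0 (the version the warning in this thread recommends) and uses the packaging library, which already appears in the pip list above:

```python
from packaging import version  # already present in the pip list above

def torch_needs_downgrade(installed: str, max_tested: str = "1.11.0") -> bool:
    """Return True if the installed torch is newer than the version the
    coremltools warning says it was tested against (an assumption here)."""
    return version.parse(installed) > version.parse(max_tested)

# The two torch versions mentioned in this thread both trip the check:
print(torch_needs_downgrade("1.12.1"))              # True
print(torch_needs_downgrade("1.13.0.dev20220822"))  # True (nightly pre-release of 1.13.0)
print(torch_needs_downgrade("1.11.0"))              # False
```

Note that packaging's version parsing handles the nightly's `.dev` suffix correctly, which a naive string or tuple comparison would not.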
Another thing you can try is pip install -U coremltools==6.0b2 to get the latest version of coremltools. Sometimes that fixes issues too.
I get a gelu ValueError when trying to convert a distilbert-base-uncased-squad2 model. I also get the same error with the full BERT model bert-large-cased-whole-word-masking-finetuned-squad. Is it that the CoreML converter cannot handle 2 inputs, one input for the "question" and another input for the "context"? How can this be fixed?
from transformers import AutoTokenizer, AutoModelForQuestionAnswering
import torch

tokenizer = AutoTokenizer.from_pretrained('twmkn9/distilbert-base-uncased-squad2')
model = AutoModelForQuestionAnswering.from_pretrained('twmkn9/distilbert-base-uncased-squad2', torchscript=True)
tokenizer.save_pretrained("local-pt-checkpoint")
model.save_pretrained("local-pt-checkpoint")
Command Line> python -m exporters.coreml --model=twmkn9/distilbert-base-uncased-squad2 --feature=question-answering local-pt-checkpoint/
ValueError: node input.19 (gelu) got 2 input(s), expected [1]
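On the two-inputs question: for extractive question answering, a BERT-style tokenizer concatenates the question and the context into a single token sequence separated by special tokens, so the traced model sees one input_ids tensor (plus an attention mask), not two separate text inputs. A toy sketch of that packing, using made-up token IDs rather than the real tokenizer (101/102 are BERT's usual [CLS]/[SEP] IDs):

```python
def pack_qa_pair(question_ids, context_ids, cls_id=101, sep_id=102):
    """Mimic how a BERT-style tokenizer joins a (question, context) pair
    into one sequence: [CLS] question [SEP] context [SEP]."""
    return [cls_id] + question_ids + [sep_id] + context_ids + [sep_id]

# Hypothetical token IDs standing in for a tokenized question and context:
ids = pack_qa_pair([7, 8, 9], [20, 21, 22, 23])
print(ids)  # [101, 7, 8, 9, 102, 20, 21, 22, 23, 102]
```

So the error above is unlikely to be about question/context inputs; "node input.19 (gelu) got 2 input(s)" points at a single gelu op in the traced graph carrying an extra argument, which fits the torch-version fix discussed earlier in this thread.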