Closed saisoulpage closed 2 months ago
Hi @saisoulpage :wave:,
First, you load the ONNX versions of mismatched models: the parseq architecture with the crnn_vgg16_bn ONNX model, and the same for detection, the linknet_resnet18 architecture with the fast_base ONNX model:
reco_model = parseq("crnn_vgg16_bn-662979cc.onnx", vocab="ABC") # wrong onnx model
det_model = linknet_resnet18("rep_fast_base-1b89ebf9.onnx") # wrong onnx model
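The mismatch is visible in the file names themselves. As a rough illustration (plain Python, not part of OnnxTR), one can sanity-check that the architecture name appears in the downloaded file name before wiring them together:

```python
def arch_matches_file(arch_name: str, onnx_file: str) -> bool:
    # Heuristic sanity check: the checkpoint file name normally contains
    # the architecture it was exported from (e.g. parseq-00b40714.onnx).
    filename = onnx_file.rsplit("/", 1)[-1]
    return arch_name in filename

# The pairings from the snippet above fail the check:
print(arch_matches_file("parseq", "crnn_vgg16_bn-662979cc.onnx"))            # False
print(arch_matches_file("linknet_resnet18", "rep_fast_base-1b89ebf9.onnx"))  # False
# A matching pair passes:
print(arch_matches_file("parseq", "parseq-00b40714.onnx"))                   # True
```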
And you cannot simply change the vocab: if you only want the model to predict A, B, or C, that would require your own custom-trained model :)
reco_model = parseq("crnn_vgg16_bn-662979cc.onnx", vocab="ABC") # works only with custom model
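To see why the vocab cannot simply be swapped, here is a toy sketch (plain Python, not OnnxTR internals) of how a vocab string maps predicted class indices back to characters:

```python
# Toy illustration: a recognition model outputs class indices,
# and the vocab string maps each index to a character.
vocab_trained = "ABCDEF"       # vocab the model was trained with
pred_indices = [2, 0, 5]       # hypothetical model output

decoded = "".join(vocab_trained[i] for i in pred_indices)
print(decoded)  # -> CAF

# Swapping in a shorter vocab the model never saw breaks decoding:
vocab_wrong = "ABC"
try:
    "".join(vocab_wrong[i] for i in pred_indices)
except IndexError:
    print("index 5 has no character in the new vocab")
```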
Custom model training ref.: https://mindee.github.io/doctr/using_doctr/custom_models_training.html
Custom model export ref.: https://mindee.github.io/doctr/using_doctr/using_model_export.html#export-to-onnx
I need to manually load the pretrained models you are using and keep them in a models directory. Could you please help me with this?
How to fix?
wget https://github.com/felixdittrich92/OnnxTR/releases/download/v0.0.1/parseq-00b40714.onnx
wget https://github.com/felixdittrich92/OnnxTR/releases/download/v0.0.1/linknet_resnet18-e0e0b9dc.onnx
reco_model = parseq("<PATH_TO>/parseq-00b40714.onnx")
det_model = linknet_resnet18("<PATH_TO>/linknet_resnet18-e0e0b9dc.onnx")
# Rest of your code
:)
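If you keep the downloaded files in your own models directory, a small helper like the following (hypothetical, not part of OnnxTR) can build and validate the path before it is passed to the model constructor:

```python
from pathlib import Path

def local_model_path(models_dir: str, filename: str) -> str:
    # Hypothetical helper: resolve a manually downloaded model file and
    # fail early with a clear message if it has not been downloaded yet.
    path = Path(models_dir) / filename
    if not path.is_file():
        raise FileNotFoundError(f"{path} not found - download it first with wget")
    return str(path)

# Usage sketch (assuming the wget downloads above landed in ./models):
# reco_model = parseq(local_model_path("models", "parseq-00b40714.onnx"))
```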
BTW.:
You can set an environment variable ONNXTR_CACHE_DIR
to any folder you want, and the models will be downloaded into this location instead of the default ~/.cache/onnxtr/models
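A minimal sketch of the described lookup order (the resolve_cache_dir helper here is illustrative, not OnnxTR's actual code):

```python
import os
from pathlib import Path

def resolve_cache_dir() -> Path:
    # ONNXTR_CACHE_DIR, if set, overrides the default cache location.
    override = os.environ.get("ONNXTR_CACHE_DIR")
    if override:
        return Path(override)
    return Path.home() / ".cache" / "onnxtr" / "models"

os.environ["ONNXTR_CACHE_DIR"] = "/tmp/my_onnxtr_models"
print(resolve_cache_dir())  # -> /tmp/my_onnxtr_models
```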
https://github.com/felixdittrich92/OnnxTR/releases/download/v0.0.1/rep_fast_base-1b89ebf9.onnx
https://github.com/felixdittrich92/OnnxTR/releases/download/v0.0.1/crnn_vgg16_bn-662979cc.onnx
How do I load these models locally?
import os
os.environ["ONNXTR_CACHE_DIR"] = "<full_path_to_folder>"
Put this at the top of your script :) Now all downloaded models will be saved in the given location instead of the default one.
Sorted it
I was looking for this because parseq and linknet_resnet18 are not working for the above models.
If you want to use the crnn and fast_base ONNX models (from your initial code snippet), you need:
from onnxtr.models import crnn_vgg16_bn, fast_base
And remove the vocab="ABC" :)
How many different languages do these models support, for example English, etc.?
One more question: I am still getting the error below.
| model = ocr_predictor(det_model=det_model, reco_model=reco_model,
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/zoo.py", line 121, in ocr_predictor
api_1 | return _predictor(
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/zoo.py", line 34, in _predictor
api_1 | det_predictor = detection_predictor(
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/detection/zoo.py", line 86, in detection_predictor
api_1 | return _predictor(arch, assume_straight_pages, load_in_8_bit, engine_cfg=engine_cfg, **kwargs)
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/detection/zoo.py", line 39, in _predictor
api_1 | _model = detection.__dict__[arch](
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/detection/models/fast.py", line 189, in fast_base
api_1 | return _fast("fast_base", model_path, load_in_8_bit, engine_cfg, **kwargs)
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/detection/models/fast.py", line 102, in _fast
api_1 | return FAST(model_path, cfg=default_cfgs[arch], engine_cfg=engine_cfg, **kwargs)
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/detection/models/fast.py", line 64, in __init__
api_1 | super().__init__(url=model_path, engine_cfg=engine_cfg, **kwargs)
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxtr/models/engine.py", line 97, in __init__
api_1 | self.runtime = InferenceSession(archive_path, providers=self.providers, sess_options=self.session_options)
api_1 | File "/usr/local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 349, in __init__
api_1 | raise TypeError("Unable to load from type '{0}'".format(type(path_or_bytes)))
api_1 | TypeError: Unable to load from type '<class 'pathlib.PosixPath'>'
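Judging from the last traceback line, a likely cause is that a pathlib.Path was passed where this onnxruntime version only accepts a plain string; converting with str() before handing the path over is a common workaround (an assumption based on the error, not a confirmed fix for this exact snippet):

```python
from pathlib import Path

model_path = Path("models") / "rep_fast_base-1b89ebf9.onnx"
# Some onnxruntime InferenceSession versions reject pathlib objects,
# so convert to a plain string before passing the path on:
model_path_str = str(model_path)
print(type(model_path_str).__name__)  # -> str
```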
Could you share a full code snippet which raises the error?
The default recognition models are trained on a French vocab, but you could also try a multilingual model (https://huggingface.co/Felix92/onnxtr-parseq-multilingual-v1) - only available via HF at the moment.
Bug description
Hi @felixdittrich92, I am getting an error while loading the model. I have attached the code below, please check it.
Code snippet to reproduce the bug
Error traceback
Environment
I am using Colab with the default Python version.