baudm / parseq

Scene Text Recognition with Permuted Autoregressive Sequence Models (ECCV 2022)
https://huggingface.co/spaces/baudm/PARSeq-OCR
Apache License 2.0
565 stars 126 forks

Hosting pretrained models on Hugging Face #33

Open NimaBoscarino opened 2 years ago

NimaBoscarino commented 2 years ago

Hey there! In addition to having the pretrained models on Torch Hub, would there be any interest in mirroring the checkpoints on the Hugging Face Hub? We have docs for how to upload models, but I'm also happy to help out with it!

baudm commented 2 years ago

Thanks for your inquiry. I actually looked at the Hugging Face Hub when I created the Gradio demo. I totally forgot about it, but I'll take another look the week after next. I'll be adding weights for other configurations as well as TorchScript models.

baudm commented 2 years ago

Models have been uploaded to https://huggingface.co/baudm/
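In the meantime, the weights can also be loaded through Torch Hub, the entry point the repo's README documents. A minimal sketch, assuming network access and a cropped word image `word_crop.png` (a hypothetical filename); the preprocessing values follow the demo code:

```python
# Sketch: load PARSeq via Torch Hub and run CPU inference on one word crop.
# Requires torch, torchvision, Pillow, and network access on first run.
import torch
from PIL import Image
from torchvision import transforms

# 'parseq' is one of the hub entry points exposed by the repo's hubconf.py.
parseq = torch.hub.load('baudm/parseq', 'parseq', pretrained=True).eval()

# PARSeq expects 32x128 crops normalized to [-1, 1].
preprocess = transforms.Compose([
    transforms.Resize((32, 128), transforms.InterpolationMode.BICUBIC),
    transforms.ToTensor(),
    transforms.Normalize(0.5, 0.5),
])

img = Image.open('word_crop.png').convert('RGB')   # hypothetical input image
with torch.inference_mode():
    logits = parseq(preprocess(img).unsqueeze(0))  # (1, max_label_len, vocab)

# Greedy decode to a string plus per-character confidences.
probs = logits.softmax(-1)
labels, confidences = parseq.tokenizer.decode(probs)
print(labels[0])
```

The first call downloads and caches the checkpoint, so subsequent runs work offline.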

baudm commented 2 years ago

TODO: need to update documentation

temiwale88 commented 11 months ago

Hi @baudm, just wanted to follow up on this related thread: could you please make the original pretrained weights from your Gradio demo easier to access via Hugging Face (with usage instructions)? I'm struggling to install this repo on my Windows + CPU machine.

Sincere thanks for your work here. It's the best model I've experimented with for my use case. – Elijah

temiwale88 commented 11 months ago

Update: I fixed the installation by using conda instead of pip to install torch and friends: conda install pytorch==1.10.1 torchvision==0.11.2 torchaudio==0.10.1 cpuonly -c pytorch. I would still love a Hugging Face implementation if possible, but you've already done more than enough if you can't get to it. Thanks!
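For anyone who prefers to stay on pip, a sketch of the CPU-only equivalent for those same versions, using PyTorch's historical wheel index (check the PyTorch "previous versions" page for your platform):

```shell
# CPU-only builds of the same versions via pip (historical wheel index).
pip install torch==1.10.1+cpu torchvision==0.11.2+cpu torchaudio==0.10.1+cpu \
    -f https://download.pytorch.org/whl/cpu/torch_stable.html

# Sanity check: prints the torch version and False (no CUDA on a cpuonly build).
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```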