When trying the model on an M1 with the web UI, I get the following error: RuntimeError: User specified an unsupported autocast device_type 'mps'
$ python -m detikzify.webui --light
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
/Users/robin/IdeaProjects/playground/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
warnings.warn(
You are using a model of type detikzify to instantiate a model of type . This is not supported for all configurations of models and can yield errors.
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
....
RuntimeError: User specified an unsupported autocast device_type 'mps'
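For reference, the error appears to come from an autocast context being entered with the model's device type, which is 'mps' on Apple Silicon; PyTorch builds that predate MPS autocast support reject that device type. A minimal sketch of the failure outside detikzify (illustrative only, not the exact call site inside transformers):

import torch

# Illustrative only: on PyTorch builds without MPS autocast support,
# constructing an autocast context for the "mps" device raises the
# same RuntimeError the webui reports.
try:
    with torch.autocast(device_type="mps"):
        pass
except RuntimeError as e:
    print(e)  # User specified an unsupported autocast device_type 'mps'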
pip show transformers
Name: transformers
Version: 4.38.2
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: /Users/robin/IdeaProjects/playground/venv/lib/python3.12/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: detikzify
According to this post (https://github.com/oobabooga/text-generation-webui/issues/5695#issuecomment-1992717978), this is an issue with transformers==4.38.2.
Installing 4.38.1 fixes the issue, and I can run the model (very slowly).
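In case it helps others on Apple Silicon, pinning the older release in the same virtualenv is what I did (exact invocation may differ per setup):

$ pip install transformers==4.38.1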