toshiaki1729 / stable-diffusion-webui-text2prompt

Extension to generate prompt from simple text for SD web UI by AUTOMATIC1111
MIT License

Got error when loading model on mac M1 pro #2

Closed lxysl closed 1 year ago

lxysl commented 1 year ago
To create a public link, set `share=True` in `launch()`.
[text2prompt] Loading model with name "all-mpnet-base-v2"...
[text2prompt] Model loaded
Traceback (most recent call last):
  File "/Users/lxy/opt/miniconda3/envs/web-ui/lib/python3.10/site-packages/gradio/routes.py", line 337, in run_predict
    output = await app.get_blocks().process_api(
  File "/Users/lxy/opt/miniconda3/envs/web-ui/lib/python3.10/site-packages/gradio/blocks.py", line 1015, in process_api
    result = await self.call_function(
  File "/Users/lxy/opt/miniconda3/envs/web-ui/lib/python3.10/site-packages/gradio/blocks.py", line 833, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/Users/lxy/opt/miniconda3/envs/web-ui/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/Users/lxy/opt/miniconda3/envs/web-ui/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/Users/lxy/opt/miniconda3/envs/web-ui/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/Users/lxy/PycharmProjects/stable-diffusion-webui/extensions/text2prompt/scripts/main.py", line 57, in generate_prompt
    tags = wd_like(text, text_neg, neg_weight, pgen.GenerationSettings(tag_range, get_conversion(conversion), power, get_sampling(sampling), n, k, p, weighted))
  File "/Users/lxy/PycharmProjects/stable-diffusion-webui/extensions/text2prompt/scripts/t2p/prompt_generator/wd_like.py", line 111, in __call__
    tag_tokens_dev = torch.from_numpy(self.tokens).to(device)
TypeError: Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead.

There is a related issue: https://github.com/pytorch/pytorch/issues/78168
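For context, the usual workaround for this PyTorch MPS limitation is to cast the NumPy array to `float32` before creating the tensor, since NumPy defaults to `float64` and the MPS backend cannot represent it. A minimal sketch (the array contents here are illustrative, not the extension's actual token data):

```python
import numpy as np
import torch

# NumPy defaults to float64, which the MPS backend does not support.
tokens = np.random.rand(4, 8)  # dtype: float64

# Pick MPS when available (Apple Silicon), otherwise fall back to CPU.
device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

# Casting to float32 first avoids the "Cannot convert a MPS Tensor to
# float64 dtype" TypeError seen in the traceback above.
tag_tokens_dev = torch.from_numpy(tokens.astype(np.float32)).to(device)
print(tag_tokens_dev.dtype)  # torch.float32
```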

toshiaki1729 commented 1 year ago

Thank you for reporting! Can you check whether commit a6c052ef7a8a7b029609c6f6447f0ac624f49d7a fixes the issue? I don't have any Mac environment…

lxysl commented 1 year ago

Thanks for your response and the quick fix! The issue is resolved after pulling the latest commit!