Closed pallavijaini0525 closed 7 months ago
Any update on this issue, please?
Apologies for the delay. Unfortunately, we don't plan to support this package on CPU. You may have to use the model directly with transformers.
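For anyone landing here: a minimal sketch of the "use the model directly with transformers" suggestion, loading the weights on CPU so vLLM (and its device probing) is bypassed entirely. The model id and prompt format below are assumptions, not part of this thread; swap in the checkpoint and Bonito prompt template you actually use.

```python
# Sketch: CPU-only generation with plain transformers, no vLLM.
# The model id "BatsResearch/bonito-v1" is an assumption; replace it
# with the (quantized) checkpoint you intend to run.
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate_on_cpu(model_id: str, prompt: str, max_new_tokens: int = 256) -> str:
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # .to("cpu") keeps every weight on the CPU, so no GPU is required.
    model = AutoModelForCausalLM.from_pretrained(model_id).to("cpu")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Hypothetical prompt; use the task-type/context template Bonito expects.
    print(generate_on_cpu("BatsResearch/bonito-v1", "Generate a question for: ..."))
```

Expect CPU generation to be much slower than the vLLM path, so keep `max_new_tokens` modest when building a large dataset.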
We have a tutorial with the quantized Bonito model (https://colab.research.google.com/drive/12OCh4OYo1vr9ZvwIWK4JwZT7rkMrYrx2?usp=sharing). You can run this on the Google Colab notebook. I hope this helps.
Thank you, will check it out.
I want to create the dataset on the CPU, and I realized the code uses vLLM. How can I make it run on the CPU and generate the dataset?
```
  File "/home/spr/.local/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 109, in __init__
    self.llm_engine = LLMEngine.from_engine_args(engine_args)
  File "/home/spr/.local/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 386, in from_engine_args
    engine_configs = engine_args.create_engine_configs()
  File "/home/spr/.local/lib/python3.10/site-packages/vllm/engine/arg_utils.py", line 286, in create_engine_configs
    device_config = DeviceConfig(self.device)
  File "/home/spr/.local/lib/python3.10/site-packages/vllm/config.py", line 496, in __init__
    raise RuntimeError("No supported device detected.")
RuntimeError: No supported device detected.
```