BatsResearch / bonito

A lightweight library for generating synthetic instruction tuning datasets for your data without GPT.
BSD 3-Clause "New" or "Revised" License
693 stars 46 forks

Can this run on the CPU, since vLLM is being used? #17

Closed pallavijaini0525 closed 7 months ago

pallavijaini0525 commented 7 months ago

I want to create the dataset on the CPU and realized the code uses vLLM. How can I make it run on the CPU and generate the dataset?

bonito = Bonito("BatsResearch/bonito-v1")

File "/home/spr/.local/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 109, in init self.llm_engine = LLMEngine.from_engine_args(engine_args) File "/home/spr/.local/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 386, in from_engine_args engine_configs = engine_args.create_engine_configs() File "/home/spr/.local/lib/python3.10/site-packages/vllm/engine/arg_utils.py", line 286, in create_engine_configs device_config = DeviceConfig(self.device) File "/home/spr/.local/lib/python3.10/site-packages/vllm/config.py", line 496, in init raise RuntimeError("No supported device detected.") RuntimeError: No supported device detected.

pallavijaini0525 commented 7 months ago

Any update on this issue, please?

nihalnayak commented 7 months ago

Apologies for the delay. Unfortunately, we don't plan on supporting this package on CPU. You might have to use the model directly with transformers.
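
For reference, a rough sketch of what that could look like on CPU. This is not part of the bonito package, and the prompt template and generation settings below are assumptions, so double-check them against the repo before relying on them. Also note that generating with a 7B model in full precision on CPU will be very slow and memory-heavy.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "BatsResearch/bonito-v1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)  # stays on CPU when no GPU is visible

# Assumed prompt template; verify the exact format in the Bonito repo.
context = "Your unannotated passage goes here."
prompt = (
    "<|tasktype|>\n"
    "extractive question answering\n"
    "<|context|>\n"
    f"{context}\n"
    "<|task|>\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, top_p=0.95)

# Drop the prompt tokens and keep only the newly generated text.
generated = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))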

We have a tutorial with the quantized Bonito model (https://colab.research.google.com/drive/12OCh4OYo1vr9ZvwIWK4JwZT7rkMrYrx2?usp=sharing). You can run it in Google Colab. I hope this helps.
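
If a GPU is available (e.g. the free Colab T4), the quantized route looks roughly like the sketch below. This is not the notebook's exact code and the 4-bit settings are assumptions; it needs bitsandbytes and a CUDA device, so it won't help on a CPU-only machine.

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(load_in_4bit=True)  # assumed setting; the Colab notebook may differ
tokenizer = AutoTokenizer.from_pretrained("BatsResearch/bonito-v1")
model = AutoModelForCausalLM.from_pretrained(
    "BatsResearch/bonito-v1",
    quantization_config=quant_config,
    device_map="auto",  # places the quantized weights on the available GPU
)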

pallavijaini0525 commented 7 months ago

Thank you, will check it out.