jaskirat8 opened 8 months ago
Same error... this doesn't work anymore. The creators also seem to be inactive.
+1
Running into the same error while trying to generate by calling the model.generate() method in the getting-started Colab notebook.
Found a relevant issue: huggingface/transformers#10160
This seems to be an issue with the `petals` library itself rather than the `transformers` library, since replacing `AutoDistributedModelForCausalLM` with `AutoModelForCausalLM` seems to work fine.
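For reference, a minimal sketch of the swap described above (the checkpoint name is an assumption for illustration, not something confirmed in this thread):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
# from petals import AutoDistributedModelForCausalLM  # distributed variant that triggers the error

model_name = "bigscience/bloom-560m"  # assumed small checkpoint, purely for illustration

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Fails during generate() when loaded through Petals:
# model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

# Works when loaded directly through transformers (no Petals swarm involved):
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A cat sat on", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```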
@daspartho, I have the same thoughts, as I have been using these models directly for months and they work. I just wanted to confirm that it is not due to a misconfiguration or an overlooked setting on my end; we need to isolate the culprit and work towards a PR, since other folks are also facing this.
Yes, I agree!
Also gently pinging @borzunov here.
[working on it]
To get bootstrapped, I tried the example from the README.
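Roughly the snippet in question (a minimal sketch based on the Petals README; the checkpoint name here is an assumption, substitute whichever model you are actually using):

```python
import torch
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# Assumed checkpoint from the README example; replace with the model being tested.
model_name = "petals-team/StableBeluga2"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# torch_dtype=torch.float32 added because of the CPU support warning;
# everything else follows the README example.
model = AutoDistributedModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float32
)

inputs = tokenizer("A cat sat on", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)  # fails here with the reported error
print(tokenizer.decode(outputs[0]))
```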
The `torch_dtype=torch.float32` was added due to a CPU support warning, but apart from that everything is the same as the original example, yet I am facing the error and unable to complete the inference.

OS: Ubuntu 22.04
CPU: i7-7700K
GPU: Nvidia 1070
Please advise if I am missing something here.