Message=The current device_map had weights offloaded to the disk. Please provide an offload_folder for them. Alternatively, make sure you have safetensors installed if the model you are using offers the weights in this format.
Source=C:\Users\Administrator\.cache\modelscope\modelscope_modules\Sunsimiao\ms_wrapper.py
StackTrace:
File "C:\Users\Administrator\.cache\modelscope\modelscope_modules\Sunsimiao\ms_wrapper.py", line 42, in __init__
self.model = AutoModelForCausalLM.from_pretrained(model_dir, device_map="auto", trust_remote_code=True)
File "C:\Users\Administrator\.cache\modelscope\modelscope_modules\Sunsimiao\ms_wrapper.py", line 20, in __init__ (Current frame)
model = SunsimiaoTextGeneration(model) if isinstance(model, str) else model
File "C:\Users\Administrator\source\repos\Sunsimiao\scripts\inference_ms.py", line 4, in <module>
pipe = pipeline(task=Tasks.text_generation,
class SunsimiaoTextGeneration(TorchModel):
    def __init__(self, model_dir=None, *args, **kwargs):
        super().__init__(model_dir, *args, **kwargs)
        self.logger = get_logger()
loading tokenizer
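The error means device_map="auto" decided to spill some weights to disk, but the from_pretrained call in ms_wrapper.py never says where. A minimal sketch of one way to fix the call site, assuming ms_wrapper.py can be edited locally (the helper name build_load_kwargs and the folder name "offload" are illustrative, not part of the original code; offload_folder itself is a standard transformers from_pretrained keyword):

```python
import os

def build_load_kwargs(offload_dir="offload"):
    """Build kwargs for AutoModelForCausalLM.from_pretrained so that
    weights spilled to disk by device_map="auto" have a place to go."""
    os.makedirs(offload_dir, exist_ok=True)  # folder must exist and be writable
    return {
        "device_map": "auto",
        "trust_remote_code": True,
        "offload_folder": offload_dir,  # where offloaded weight shards are written
    }

# The call at ms_wrapper.py line 42 would then become (sketch):
# self.model = AutoModelForCausalLM.from_pretrained(model_dir, **build_load_kwargs())
```

Alternatively, as the message suggests, installing safetensors (pip install safetensors) may be enough on its own if the model repository ships safetensors-format weights, since the loader can then memory-map those files instead of offloading.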