Open myBigbug opened 1 month ago
I have already run `pip install -i https://pypi.org/simple/ bitsandbytes`, but the error persists.
Since bitsandbytes (see issue) does not yet support MPS, int4 models cannot run on Mac for now.
Thanks for the explanation; I will close this issue once support lands.
Is Win10 currently supported? I installed the requirements and bitsandbytes, but still hit `raise ImportError`:
ImportError: Using bitsandbytes
8-bit quantization requires Accelerate: `pip install accelerate`
and the latest version of bitsandbytes: `pip install -i https://pypi.org/simple/ bitsandbytes`
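Transformers raises the ImportError above when either `accelerate` or `bitsandbytes` cannot be imported in the current environment. A minimal, hypothetical helper (not part of Transformers) to check which dependency is actually missing before reinstalling:

```python
import importlib.util

def missing_quantization_deps(required=("accelerate", "bitsandbytes")):
    """Return the subset of `required` packages that cannot be imported."""
    # find_spec returns None when a top-level package is not installed
    return [name for name in required if importlib.util.find_spec(name) is None]

if __name__ == "__main__":
    print(missing_quantization_deps())  # e.g. [] when both are installed
```

Running this inside the same conda environment you launch the script from helps rule out installing into one environment and running from another, a common cause of this error on Windows.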
bitsandbytes currently supports CUDA devices only.
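Given that bitsandbytes only works on CUDA, a loading script can gate quantization on the detected backend instead of failing. A sketch with a hypothetical helper (the function name and fallback policy are assumptions, not project code); in practice the flags would come from `torch.cuda.is_available()` and `torch.backends.mps.is_available()`:

```python
def pick_quantization(cuda_available: bool, mps_available: bool):
    """Choose a (device, load_in_4bit) pair; fall back to full precision off-CUDA."""
    if cuda_available:
        return "cuda", True   # bitsandbytes int4/int8 path is available
    if mps_available:
        return "mps", False   # Mac: load fp16/fp32 weights, skip bitsandbytes
    return "cpu", False       # last resort: unquantized on CPU
```

Loading unquantized on MPS/CPU needs considerably more memory than int4, so this is a workaround rather than an equivalent setup.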
Thank you very much.
On 2024-06-17 20:37:17, "Hongji Zhu" @.***> wrote:
+1
FutureWarning: `evaluation_strategy` is deprecated and will be removed in version 4.46 of 🤗 Transformers. Use `eval_strategy` instead.
  warnings.warn(
[2024-07-16 11:23:00,796] [INFO] [comm.py:637:init_distributed] cdb=None
[2024-07-16 11:23:00,796] [INFO] [comm.py:668:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
Unused kwargs: ['_load_in_4bit', '_load_in_8bit', 'quant_method']. These kwargs are not used in <class 'transformers.utils.quantization_config.BitsAndBytesConfig'>.
`low_cpu_mem_usage` was None, now set to True since model is quantized.
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████████████████████████| 2/2 [00:35<00:00, 17.78s/it]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Currently using LoRA for fine-tuning the MiniCPM-V model.
Traceback (most recent call last):
  File "/mnt/c/Users/akhil/OneDrive/Desktop/WITBE_INTA/llm/MiniCPM-V-main/finetune/finetune.py", line 328, in
Failures:
I have also set the correct path to the JSON dataset.
是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
当前行为 | Current Behavior
I installed the required dependencies in a conda environment; when I run `PYTORCH_ENABLE_MPS_FALLBACK=1 python xxx.py` locally on my Mac, I get an error.
期望行为 | Expected Behavior
The script runs and produces output.
复现方法 | Steps To Reproduce
Run `PYTORCH_ENABLE_MPS_FALLBACK=1 python xxx.py`.
运行环境 | Environment
备注 | Anything else?
Logs: