Closed lizhao-8202 closed 4 months ago
The warning issue is handled. Is it appropriate to change base_model from ./saved model to savedmodel? If the machine cannot connect to the internet, what files do I need to prepare offline?
You need to download the model from https://huggingface.co/FreedomIntelligence/GrammarGPT/tree/main into the savedmodel folder. Connecting directly to huggingface may be difficult, so it is recommended to find a workable way to download the model first and then place it in the directory.
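One common way to fetch the files on a machine that does have Hub access is a git-lfs clone (a sketch, not the project's documented procedure; assumes git and git-lfs are installed):

```shell
# Download the GrammarGPT weights into ./savedmodel. git-lfs is required
# because the weight files are stored with LFS; a plain clone only fetches
# small pointer files.
git lfs install
git clone https://huggingface.co/FreedomIntelligence/GrammarGPT savedmodel
```

Afterwards, copy the savedmodel folder to the offline server and point base_model at it.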
Thanks.
When running generate.py, I get the error: huggingface_hub.utils._validators.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: './saved model'
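The error happens because './saved model' contains a space and does not resolve to an existing directory, so transformers falls back to treating the string as a Hub repo id, which then fails validation. A minimal sketch of that rule (a simplified approximation of the message above; it ignores the optional namespace/name split and is not the library's actual validator):

```python
import re

# Approximate repo-id rule, paraphrased from the HFValidationError message:
# alphanumerics plus '-', '_', '.'; '--' and '..' forbidden; the name must
# not start or end with '-' or '.'; max length is 96.
REPO_ID_RE = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9._-]*[A-Za-z0-9])?$")

def looks_like_repo_id(name: str) -> bool:
    if len(name) > 96 or "--" in name or ".." in name:
        return False
    return REPO_ID_RE.match(name) is not None

print(looks_like_repo_id("savedmodel"))     # → True
print(looks_like_repo_id("./saved model"))  # → False: leading '.' and a space
```

So renaming the folder to a plain name like savedmodel (no space, no './') sidesteps the validation error, provided the directory actually exists next to generate.py.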
Warning messages:

/opt/python3.10/python3/lib/python3.10/site-packages/torch/cuda/__init__.py:118: UserWarning: CUDA initialization: The NVIDIA driver on your system is too old (found version 10020). Please update your GPU driver by downloading and installing a new version from the URL: http://www.nvidia.com/Download/index.aspx Alternatively, go to: https://pytorch.org to install a PyTorch version that has been compiled with your version of the CUDA driver. (Triggered internally at ../c10/cuda/CUDAFunctions.cpp:108.)
return torch._C._cuda_getDeviceCount() > 0

/opt/python3.10/python3/lib/python3.10/site-packages/bitsandbytes/cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
warn("The installed version of bitsandbytes was compiled without GPU support. ")
'NoneType' object has no attribute 'cadam32bit_grad_fp32'
After I changed base_model in generate.py from ./saved model to savedmodel, I got an error (the company intranet cannot reach the external network): OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like savedmodel is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
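That OSError means transformers could not find a local directory named savedmodel containing config.json, so it tried the Hub and the intranet blocked it. A small pre-flight check before loading makes the failure explicit (ensure_local_model is a hypothetical helper, not part of GrammarGPT; HF_HUB_OFFLINE is a real huggingface_hub environment variable):

```python
import os

def ensure_local_model(base_model: str) -> str:
    """Verify base_model is a local directory holding config.json, then
    force offline mode so transformers never tries to reach huggingface.co."""
    config_path = os.path.join(base_model, "config.json")
    if not os.path.isfile(config_path):
        raise FileNotFoundError(
            f"{config_path} not found; download the model files into "
            f"{base_model!r} first"
        )
    os.environ["HF_HUB_OFFLINE"] = "1"
    return config_path
```

In generate.py you would call ensure_local_model("savedmodel") before from_pretrained; once every file from the model repo (config.json, tokenizer files, weight shards) is in the folder, loading works with no network at all.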
Do the warnings that appeared before changing base_model in generate.py from ./saved model to savedmodel (they still appear after the change) need to be handled? As for the error after changing base_model: since I cannot get past the firewall, I cannot visit the site mentioned in the message, so I do not know whether this can be fixed just by downloading some files from somewhere and placing them locally.