Open hhllxx1121 opened 1 year ago
I ran into the same problem.
Put the downloaded custom_autotune.py into the corresponding module folder, e.g. ~/.cache/huggingface/modules/transformers_modules/local/.

Then in quantization.py change

```python
try:
    import triton
    import triton.language as tl
    from .custom_autotune import *
except:
```

to

```python
try:
    import triton
    import triton.language as tl
    from custom_autotune import *
except:
```
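The absolute import `from custom_autotune import *` only resolves if the file's directory happens to be on `sys.path`. A more robust variant is to put the module's own directory on the path before importing. This is a minimal sketch, not the actual MOSS code; the `HAS_TRITON` flag and the `__file__` fallback are my own additions:

```python
import os
import sys

# Directory expected to contain quantization.py and, next to it,
# custom_autotune.py (hypothetical layout mirroring the
# transformers_modules cache folder). Fall back to the working
# directory when __file__ is unavailable (e.g. interactive use).
MODULE_DIR = (
    os.path.dirname(os.path.abspath(__file__))
    if "__file__" in globals()
    else os.getcwd()
)

try:
    import triton
    import triton.language as tl
    # Put this module's folder first on sys.path so the absolute
    # import below resolves regardless of the working directory.
    if MODULE_DIR not in sys.path:
        sys.path.insert(0, MODULE_DIR)
    from custom_autotune import *  # noqa: F401,F403
    HAS_TRITON = True
except ImportError:
    # Either triton or custom_autotune is missing; fall back to the
    # non-triton code path.
    HAS_TRITON = False

print("triton kernels available:", HAS_TRITON)
```

Catching `ImportError` instead of a bare `except:` also avoids silently swallowing unrelated errors raised inside custom_autotune.py.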
> Put the downloaded custom_autotune.py into the corresponding module folder, e.g. ~/.cache/huggingface/modules/transformers_modules/local/.

Putting it there didn't work for me either...
> Change `from .custom_autotune import *` to `from custom_autotune import *` in quantization.py.

I still get the error after making that change.
> ~/.cache/huggingface/modules/transformers_modules/local/

I downloaded the model manually and created an fnlp folder myself, then put the downloaded model straight into that folder. Where is the path ~/.cache/huggingface/modules/transformers_modules/local/? I can't find a .cache folder.
> Where is the path ~/.cache/huggingface/modules/transformers_modules/local/? I can't find a .cache folder.

On Linux you can just `cd ~/.cache/huggingface/modules/transformers_modules`, but on my machine there is no `local` folder inside it.
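To check the cache folder programmatically instead of guessing, you can expand the `~` yourself and list what is there. A small sketch, assuming the default cache location (i.e. `HF_HOME`/`XDG_CACHE_HOME` not set); in my experience the `local` subfolder is only created after a model has been loaded from a local path with `trust_remote_code=True`:

```python
import os

# Expand "~" to the current user's home directory and look for the
# transformers remote-code module cache.
cache = os.path.expanduser("~/.cache/huggingface/modules/transformers_modules")
print("cache path:", cache)

if os.path.isdir(cache):
    # Subfolders are named per model source; a model loaded from a
    # local path typically lands under "local".
    print("contents:", os.listdir(cache))
else:
    print("cache folder does not exist yet")
```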
https://github.com/linonetwo/MOSS-DockerFile
I fixed all of these problems in that Dockerfile; related notes (in Chinese): https://onetwo.ren/wiki/#调研GPU上运行的语言模型
I get the error when running the code; here is my triton installation:

```
$ pip show triton
Name: triton
Version: 2.0.0
Summary: A language and compiler for custom Deep Learning operations
Home-page: https://github.com/openai/triton/
Author: Philippe Tillet
Author-email: phil@openai.com
License:
Location: /opt/miniconda3/envs/moss/lib/python3.8/site-packages
Requires: cmake, filelock, lit, torch
Required-by: torch
```
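Since `pip show` only reads package metadata, it is worth confirming that the interpreter you actually run can import triton; a mismatch between the `pip` environment and the running Python is a common cause of this class of error. A quick check:

```python
import importlib.util

# find_spec returns None when the current interpreter cannot locate
# the package at all, without executing the package's import code.
spec = importlib.util.find_spec("triton")
if spec is None:
    print("triton is NOT importable from this interpreter")
else:
    print("triton found at:", spec.origin)
```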