taishan1994 / Llama3.1-Finetuning

Full-parameter, LoRA, and QLoRA fine-tuning for Llama 3.
Apache License 2.0
149 stars · 15 forks

Is there any model evaluation code? #8

Closed Lavenderlyu closed 2 months ago

Lavenderlyu commented 4 months ago

Hello! Regarding this series of code: is there any evaluation code or shell script for evaluating the fine-tuned model on benchmark datasets such as ceval-exam or MMLU?

taishan1994 commented 4 months ago

No, there isn't. For that you can refer to their official evaluation code.
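Whatever harness is used (the official scripts, or a third-party one such as EleutherAI's lm-evaluation-harness), the core of an MMLU/C-Eval style benchmark is multiple-choice accuracy. A minimal sketch of that logic, where `score_choice` is a hypothetical stand-in for a real model call (e.g. the summed log-likelihood the model assigns to each answer option):

```python
# Sketch of MMLU/C-Eval-style multiple-choice evaluation.
# `score_choice` is a hypothetical stand-in for a real model scorer.

def evaluate(items, score_choice):
    """items: list of (question, options, gold_index) tuples.
    Picks the highest-scoring option per question and returns accuracy."""
    correct = 0
    for question, options, gold in items:
        scores = [score_choice(question, opt) for opt in options]
        pred = max(range(len(options)), key=lambda i: scores[i])
        correct += int(pred == gold)
    return correct / len(items)

# Toy run with a dummy scorer that simply prefers the longest option.
items = [
    ("2+2=?", ["3", "4", "22"], 1),
    ("Capital of France?", ["Paris", "Rome"], 0),
]
dummy = lambda q, opt: len(opt)
print(evaluate(items, dummy))  # → 0.5
```

In a real run, `score_choice` would call the fine-tuned model once per option and the items would come from the benchmark's test split; everything else stays the same.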

Lavenderlyu commented 4 months ago

Thank you!! One more question: how did you install bitsandbytes==0.43.0 in your environment? All my other libraries match the requirements file; only bitsandbytes fails:

```
(llama3_ft) [xylv@server01 script]$ pip install bitsandbytes==0.43.0
ERROR: Could not find a version that satisfies the requirement bitsandbytes==0.43.0
(from versions: 0.31.8, 0.32.0, 0.32.1, 0.32.2, 0.32.3, 0.33.0, 0.33.1, 0.34.0,
0.35.0, 0.35.1, 0.35.2, 0.35.3, 0.35.4, 0.36.0, 0.36.0.post1, 0.36.0.post2,
0.37.0, 0.37.1, 0.37.2, 0.38.0, 0.38.0.post1, 0.38.0.post2, 0.38.1, 0.39.0,
0.39.1, 0.40.0, 0.40.0.post1, 0.40.0.post2, 0.40.0.post3, 0.40.0.post4, 0.40.1,
0.40.1.post1, 0.40.2, 0.41.0, 0.41.1, 0.41.2, 0.41.2.post1, 0.41.2.post2,
0.41.3, 0.41.3.post1, 0.41.3.post2, 0.42.0)
ERROR: No matching distribution found for bitsandbytes==0.43.0
```

My Python version is 3.8.19. `pip show` output for the environment:

```
Name: bitsandbytes
Version: 0.42.0
Summary: k-bit optimizers and matrix multiplication routines.
Home-page: https://github.com/TimDettmers/bitsandbytes
Author: Tim Dettmers
Author-email: dettmers@cs.washington.edu
License: MIT
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: scipy
Required-by:
```
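A likely cause here (an assumption, not verified against PyPI metadata): pip silently drops any release whose `Requires-Python` specifier or available wheels exclude the running interpreter, which would explain why the candidate list stops at 0.42.0 on Python 3.8. The filtering amounts to the following sketch, where the `releases` metadata is hypothetical and for illustration only:

```python
# Illustration of how pip filters releases by interpreter version.
# The minimum-version metadata below is hypothetical, for illustration only.

releases = {
    "0.42.0": (3, 8),   # minimum (major, minor) the release accepts
    "0.43.0": (3, 9),   # hypothetical: would exclude Python 3.8
}

def candidates(interpreter, releases):
    """Return the releases installable on the given (major, minor) interpreter."""
    return [v for v, minimum in releases.items() if interpreter >= minimum]

print(candidates((3, 8), releases))  # → ['0.42.0']
print(candidates((3, 9), releases))  # → ['0.42.0', '0.43.0']
```

If that is the cause, the fix is either a newer Python in the conda env or pinning the newest bitsandbytes that pip lists.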

```
Name: pydantic
Version: 1.10.6
Summary: Data validation and settings management using python type hints
Home-page: https://github.com/pydantic/pydantic
Author: Samuel Colvin
Author-email: s@muelcolvin.com
License: MIT
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: typing-extensions
Required-by: deepspeed

Name: modelscope
Version: 1.13.3
Summary: ModelScope: bring the notion of Model-as-a-Service to life.
Home-page: https://github.com/modelscope/modelscope
Author: ModelScope team
Author-email: contact@modelscope.cn
License: Apache License 2.0
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: addict, attrs, datasets, einops, filelock, gast, huggingface-hub, numpy, oss2, pandas, Pillow, pyarrow, python-dateutil, pyyaml, requests, scipy, setuptools, simplejson, sortedcontainers, tqdm, urllib3, yapf
Required-by:

Name: datasets
Version: 2.18.0
Summary: HuggingFace community-driven open-source library of datasets
Home-page: https://github.com/huggingface/datasets
Author: HuggingFace Inc.
Author-email: thomas@huggingface.co
License: Apache 2.0
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: aiohttp, dill, filelock, fsspec, huggingface-hub, multiprocess, numpy, packaging, pandas, pyarrow, pyarrow-hotfix, pyyaml, requests, tqdm, xxhash
Required-by: modelscope

Name: flash-attn
Version: 2.5.6
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Home-page: https://github.com/Dao-AILab/flash-attention
Author: Tri Dao
Author-email: trid@cs.stanford.edu
License:
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: einops, ninja, packaging, torch
Required-by:

Name: Jinja2
Version: 3.1.3
Summary: A very fast and expressive template engine.
Home-page: https://palletsprojects.com/p/jinja/
Author:
Author-email:
License: BSD-3-Clause
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: MarkupSafe
Required-by: torch

Name: numpy
Version: 1.24.4
Summary: Fundamental package for array computing in Python
Home-page: https://www.numpy.org
Author: Travis E. Oliphant et al.
Author-email:
License: BSD-3-Clause
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires:
Required-by: accelerate, datasets, deepspeed, modelscope, pandas, peft, pyarrow, scipy, transformers

Name: peft
Version: 0.5.0
Summary: Parameter-Efficient Fine-Tuning (PEFT)
Home-page: https://github.com/huggingface/peft
Author: The HuggingFace team
Author-email: sourab@huggingface.co
License: Apache
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: accelerate, numpy, packaging, psutil, pyyaml, safetensors, torch, tqdm, transformers
Required-by:

Name: accelerate
Version: 0.28.0
Summary: Accelerate
Home-page: https://github.com/huggingface/accelerate
Author: The HuggingFace team
Author-email: zach.mueller@huggingface.co
License: Apache
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: huggingface-hub, numpy, packaging, psutil, pyyaml, safetensors, torch
Required-by: peft

Name: deepspeed
Version: 0.9.4
Summary: DeepSpeed library
Home-page: http://deepspeed.ai
Author: DeepSpeed Team
Author-email: deepspeed-info@microsoft.com
License: Apache Software License 2.0
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: hjson, ninja, numpy, packaging, psutil, py-cpuinfo, pydantic, torch, tqdm
Required-by:

Name: transformers
Version: 4.38.2
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: peft

Name: torch
Version: 2.0.1
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: packages@pytorch.org
License: BSD-3
Location: /home/xylv/anaconda3/envs/llama3_ft/lib/python3.8/site-packages
Requires: filelock, jinja2, networkx, nvidia-cublas-cu11, nvidia-cuda-cupti-cu11, nvidia-cuda-nvrtc-cu11, nvidia-cuda-runtime-cu11, nvidia-cudnn-cu11, nvidia-cufft-cu11, nvidia-curand-cu11, nvidia-cusolver-cu11, nvidia-cusparse-cu11, nvidia-nccl-cu11, nvidia-nvtx-cu11, sympy, triton, typing-extensions
Required-by: accelerate, deepspeed, flash-attn, peft, triton
```

taishan1994 commented 4 months ago

It was also installed with pip. If that doesn't work, try installing the latest version pip shows. Or find the project on GitHub and install it from source.
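The fallback in this reply (install the newest release pip actually lists) can be picked mechanically from the version list in the error message. A minimal stdlib-only sketch, using a shortened sample of that list; `version_key` handles plain and `.postN` releases but is not a full PEP 440 parser:

```python
# Pick the newest installable release from the versions pip reported.
# Handles plain versions and ".postN" suffixes; not a full PEP 440 parser.

def version_key(v):
    # "0.41.2.post1" -> (0, 41, 2, 1); "0.41.2" -> (0, 41, 2)
    parts = []
    for p in v.split("."):
        parts.append(int(p) if p.isdigit() else int(p.replace("post", "")))
    return tuple(parts)

# Shortened sample from the error message's "(from versions: ...)" list.
available = ["0.40.2", "0.41.3", "0.41.3.post2", "0.42.0"]
latest = max(available, key=version_key)
print(latest)  # → 0.42.0
```

So on this Python 3.8 environment, `pip install bitsandbytes==0.42.0` is the newest pin that will resolve; anything beyond that would need the source install route.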