MrYxJ / calculate-flops.pytorch

calflops is designed to calculate FLOPs, MACs, and parameter counts for a wide range of neural networks, such as Linear, CNN, RNN, GCN, and Transformer models (BERT, LLaMA, and other large language models).
https://pypi.org/project/calflops/
MIT License
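For context, the library's basic entry point takes an instantiated torch.nn.Module plus an input shape and returns FLOPs, MACs, and parameter counts. A minimal sketch along the lines of the project's documented usage (the AlexNet model here is an arbitrary choice):

from calflops import calculate_flops
from torchvision import models

# Any torch.nn.Module can be profiled; AlexNet is just an arbitrary example.
model = models.alexnet()
batch_size = 1
flops, macs, params = calculate_flops(model=model,
                                      input_shape=(batch_size, 3, 224, 224))
print("FLOPs:%s  MACs:%s  Params:%s" % (flops, macs, params))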

error with the example code #14

Closed tranlm closed 4 months ago

tranlm commented 6 months ago

Hi. I'm trying to run the example code you've listed on the README.md, but am getting the following error. I'm wondering if there's a dependency I'm unaware of. I've already confirmed I have access to the llama models. Thanks.

from calflops import calculate_flops_hf

model_name = "meta-llama/Llama-2-7b"
access_token = "<MY API KEY"
calculate_flops_hf(model_name=model_name, access_token=access_token)

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/var/data/python/lib/python3.11/site-packages/calflops/flops_counter_hf.py", line 77, in calculate_flops_hf
    empty_model = create_empty_model(model_name=model_name,
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/data/python/lib/python3.11/site-packages/calflops/estimate.py", line 99, in create_empty_model
    raise ValueError(
ValueError: Model meta-llama/Llama-2-7b does not have any library metadata on the Hub, please manually pass in a --library_name to use (such as transformers)
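For reference (not part of the original report): the repo ID in the snippet, meta-llama/Llama-2-7b, is the raw checkpoint upload, while the transformers-format upload meta-llama/Llama-2-7b-hf does carry library metadata on the Hub, which is what create_empty_model looks for in the traceback above. A hedged sketch of the same call against that repo ID, assuming the access token has been granted permission for it:

from calflops import calculate_flops_hf

# Same call as in the report, but pointed at the transformers-format repo,
# which has the Hub library metadata that create_empty_model checks for.
model_name = "meta-llama/Llama-2-7b-hf"
access_token = "<MY API KEY>"  # placeholder token, as in the original snippet

calculate_flops_hf(model_name=model_name, access_token=access_token)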

MrYxJ commented 4 months ago

Hello, sorry, I have been busy recently and only just saw this issue. It is not convenient for me to help verify the API key you applied for to access Llama. I recommend downloading the full Llama weights locally and then using this tool to calculate FLOPs and the other metrics.
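A minimal sketch of that suggestion, following the pattern of the project's README example for transformers models, and assuming the Llama weights have been downloaded locally in transformers format (the path, batch size, and sequence length below are placeholders):

from calflops import calculate_flops
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "/path/to/local/llama-2-7b-hf"  # placeholder local checkpoint directory
batch_size, max_seq_length = 1, 128

# Load the locally downloaded weights, then profile the instantiated model.
model = AutoModelForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

flops, macs, params = calculate_flops(model=model,
                                      input_shape=(batch_size, max_seq_length),
                                      transformer_tokenizer=tokenizer)
print("FLOPs:%s  MACs:%s  Params:%s" % (flops, macs, params))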