Closed JackeyWang777 closed 5 months ago
```python
from calflops import calculate_flops
from transformers import GemmaTokenizer, GemmaForCausalLM

batch_size, max_seq_length = 1, 128

model_save = "/data/code_data/share_package/models/gemma-1.1-7b-it/"
model = GemmaForCausalLM.from_pretrained(model_save)
tokenizer = GemmaTokenizer.from_pretrained(model_save)

flops, macs, params = calculate_flops(
    model=model,
    input_shape=(batch_size, max_seq_length),
    transformer_tokenizer=tokenizer,
)
print("FLOPs:%s  MACs:%s  Params:%s \n" % (flops, macs, params))
```
This worked.
Thanks for your code, but when I reproduce it with Gemma the reported params is 0. Is Gemma not supported?
```
Total Training Params: 0
fwd MACs: 1.1 TMACs
fwd FLOPs: 2.19 TFLOPS
fwd+bwd MACs: 3.29 TMACs
fwd+bwd FLOPs: 6.58 TFLOPS
```
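As a quick cross-check when a tool reports 0 params, you can count the model's parameters directly with PyTorch, independently of calflops. A minimal sketch (the toy `nn.Linear` stands in for the loaded `GemmaForCausalLM`; the helper name `count_params` is my own):

```python
import torch.nn as nn

def count_params(model: nn.Module) -> int:
    # Sum the element counts of all registered parameter tensors.
    return sum(p.numel() for p in model.parameters())

# Toy stand-in: 4*3 weights + 3 biases = 15 parameters.
toy = nn.Linear(4, 3)
print(count_params(toy))  # 15
```

If this returns a nonzero count for the real model while calflops still prints `Total Training Params: 0`, the problem is in how calflops traverses the model rather than in the checkpoint itself.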