eduand-alvarez opened this issue 4 weeks ago
Hi @eduand-alvarez, `reuse_cache` was not enabled for Phi in v1.11.1. Can you install the library from source with

```
pip install git+https://github.com/huggingface/optimum-habana.git
```

to get this change and let me know if that works? The next stable release should be published soon.

Alternatively, if you prefer sticking to v1.11.1, you should run `run_lm_eval.py` from tag v1.11.1: https://github.com/huggingface/optimum-habana/blob/v1.11.1/examples/text-generation/run_lm_eval.py
**System Info**

**Information**

**Tasks**
- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)

**Reproduction**

Getting the following error:
```
Traceback (most recent call last):
  File "/home/ubuntu/optimum-habana/examples/text-generation/run_lm_eval.py", line 187, in <module>
    main()
  File "/home/ubuntu/optimum-habana/examples/text-generation/run_lm_eval.py", line 156, in main
    lm = HabanaModelAdapter(tokenizer, model, args, generation_config)
  File "/home/ubuntu/optimum-habana/examples/text-generation/run_lm_eval.py", line 91, in __init__
    self.warm_up()
  File "/home/ubuntu/optimum-habana/examples/text-generation/run_lm_eval.py", line 96, in warm_up
    self._model_call(inps)
  File "/home/ubuntu/optimum-habana/examples/text-generation/run_lm_eval.py", line 142, in _model_call
    logits = self.model(inps.to(self._device), **self.model_inputs)["logits"].cpu()
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1514, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1564, in _call_impl
    result = forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/habana_frameworks/torch/hpu/graphs.py", line 661, in forward
    return wrapped_hpugraph_forward(cache, stream, orig_fwd, args, kwargs, disable_tensor_cache, asynchronous, dry_run, max_graphs)
  File "/usr/local/lib/python3.10/dist-packages/habana_frameworks/torch/hpu/graphs.py", line 544, in wrapped_hpugraph_forward
    outputs = orig_fwd(*args, **kwargs)
TypeError: GaudiPhiForCausalLM.forward() got an unexpected keyword argument 'reuse_cache'
```
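The `TypeError` at the bottom of the trace is ordinary Python keyword-argument behavior: the HPU-graph wrapper unpacks whatever kwargs it received into the model's `forward()`, and a `forward()` signature without a `reuse_cache` parameter rejects it. A minimal sketch of the failure mode, using hypothetical stand-in classes rather than the real optimum-habana code:

```python
class PhiForwardOld:
    # Hypothetical stand-in: forward() without a reuse_cache parameter,
    # like the Phi model class in v1.11.1.
    def forward(self, input_ids):
        return {"logits": [0.0] * len(input_ids)}


class PhiForwardNew:
    # Hypothetical stand-in: forward() that accepts reuse_cache,
    # as on the current main branch.
    def forward(self, input_ids, reuse_cache=False):
        return {"logits": [0.0] * len(input_ids)}


# run_lm_eval.py builds a dict of model inputs and unpacks it into forward():
model_inputs = {"reuse_cache": True}

try:
    PhiForwardOld().forward([1, 2, 3], **model_inputs)
except TypeError as err:
    print(err)  # ... got an unexpected keyword argument 'reuse_cache'

out = PhiForwardNew().forward([1, 2, 3], **model_inputs)
print(out["logits"])  # [0.0, 0.0, 0.0]
```

So the error is consistent with running a `run_lm_eval.py` that passes `reuse_cache` against a model class from a release that does not yet accept it.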
Is Phi not supported, or am I not using the right version of something?
**Expected behavior**

Should yield harness scores.