Closed bw-Deejee closed 3 months ago
Nevermind. I looked deeper into the repo changes, and it looks like I just had to rerun:
pip install -r requirements.txt -c reqs_optional/reqs_constraints.txt
in the newest repo version, and now it works.
Yeah, a newer transformers version was required. Thanks.
Are Llama 3.1 models currently supported?
I pulled the recent repo; however, I didn't go through the full installation requirements, and the following doesn't work:
python generate.py --base_model=meta-llama/Meta-Llama-3.1-8B-Instruct --use_auth_token=...
Error message:
ValueError: 'rope_scaling' must be a dictionary with two fields, 'type' and 'factor', got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
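For context, the error comes from older transformers releases, which validated `rope_scaling` as a two-field `{'type', 'factor'}` dict, while Llama 3.1 checkpoints ship the newer `llama3` rope config with extra fields. Below is a simplified sketch of that old-style check (not the actual transformers source; the function name `validate_rope_scaling` is made up for illustration), showing why the Llama 3.1 config trips it:

```python
# Hypothetical, simplified version of the old transformers-style check:
# it only accepted a dict with exactly the 'type' and 'factor' fields.
def validate_rope_scaling(rope_scaling):
    if rope_scaling is None:
        return
    if not isinstance(rope_scaling, dict) or len(rope_scaling) != 2:
        raise ValueError(
            "`rope_scaling` must be a dictionary with two fields, "
            f"`type` and `factor`, got {rope_scaling}"
        )

# The rope_scaling dict from Meta-Llama-3.1-8B-Instruct's config.json
# (copied from the error message above) has five fields, so it fails:
llama31_rope = {
    "factor": 8.0,
    "low_freq_factor": 1.0,
    "high_freq_factor": 4.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3",
}
try:
    validate_rope_scaling(llama31_rope)
except ValueError:
    print("old-style validation rejects the new llama3 rope_scaling")
```

Upgrading transformers (via the `pip install ... -c reqs_optional/reqs_constraints.txt` rerun mentioned in the resolution) pulls in a version whose config validation understands the `llama3` rope type.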