Open llCurious opened 1 year ago
I am trying to run LLaMA using EasyLM, following the README for LLaMA. The first step is converting the raw LLaMA parameters:
```shell
python -m EasyLM.models.llama.convert_torch_to_easylm.py \
    --checkpoint_dir='path/to/torch/llama/checkpoint' \
    --output_dir='path/to/output/easylm/checkpoint' \
    --streaming=True
```
The arg `output_dir` does not appear in `convert_torch_to_easylm.py`; it should now be `output_file`, as shown in the code.
I wonder if the doc is outdated?
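Based on the current argument name in the script, a corrected invocation would presumably look like this (paths are placeholders; note that `python -m` takes a module name, so the `.py` suffix should also be dropped):

```shell
python -m EasyLM.models.llama.convert_torch_to_easylm \
    --checkpoint_dir='path/to/torch/llama/checkpoint' \
    --output_file='path/to/output/easylm/checkpoint' \
    --streaming=True
```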
Yeah the doc is a bit outdated due to a lot of changes I made recently. I will try to update it more frequently.