Closed: ryj0902 closed this issue 5 months ago
Solved by adding chat_template: llama3 to the config file.
Previously, even if chat_template was not declared in the config, register_llama3_template() was still called through the else branch. After the code was modified, however, setting chat_template explicitly became mandatory.
diff --git a/src/axolotl/cli/preprocess.py b/src/axolotl/cli/preprocess.py
index a95427d..e7b3596 100644
--- a/src/axolotl/cli/preprocess.py
+++ b/src/axolotl/cli/preprocess.py
@@ -39,21 +39,22 @@ def do_cli(config: Union[Path, str] = Path("examples/"), **kwargs):
return_remaining_strings=True
)
- if parsed_cfg.chat_template == "chatml" and parsed_cfg.default_system_message:
- LOG.info(
- f"ChatML set. Adding default system message: {parsed_cfg.default_system_message}"
- )
- register_chatml_template(parsed_cfg.default_system_message)
- else:
- register_chatml_template()
-
- if parsed_cfg.chat_template == "llama3" and parsed_cfg.default_system_message:
- LOG.info(
- f"LLaMA-3 set. Adding default system message: {parsed_cfg.default_system_message}"
- )
- register_llama3_template(parsed_cfg.default_system_message)
- else:
- register_llama3_template()
+ if parsed_cfg.chat_template == "chatml":
+ if parsed_cfg.default_system_message:
+ LOG.info(
+ f"ChatML set. Adding default system message: {parsed_cfg.default_system_message}"
+ )
+ register_chatml_template(parsed_cfg.default_system_message)
+ else:
+ register_chatml_template()
+ elif parsed_cfg.chat_template == "llama3":
+ if parsed_cfg.default_system_message:
+ LOG.info(
+ f"LLaMA-3 set. Adding default system message: {parsed_cfg.default_system_message}"
+ )
+ register_llama3_template(parsed_cfg.default_system_message)
+ else:
+ register_llama3_template()
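To make the behavioural difference concrete, here is a minimal, standalone sketch (not axolotl code; the register_* functions are stand-in stubs) of the two dispatch styles in the diff above. With chat_template left unset, the old structure still registers the llama3 template through its else branch, while the new structure registers nothing, which is why the config now has to declare the template explicitly.

def register_chatml_template(msg=None):
    print(f"registered chatml (system message: {msg!r})")

def register_llama3_template(msg=None):
    print(f"registered llama3 (system message: {msg!r})")

def old_dispatch(chat_template, default_system_message=None):
    # Pre-patch: each else fires whenever its combined condition is false,
    # so both templates get registered even when chat_template is None.
    if chat_template == "chatml" and default_system_message:
        register_chatml_template(default_system_message)
    else:
        register_chatml_template()
    if chat_template == "llama3" and default_system_message:
        register_llama3_template(default_system_message)
    else:
        register_llama3_template()

def new_dispatch(chat_template, default_system_message=None):
    # Post-patch: nothing is registered unless chat_template is explicitly
    # set to "chatml" or "llama3".
    if chat_template == "chatml":
        if default_system_message:
            register_chatml_template(default_system_message)
        else:
            register_chatml_template()
    elif chat_template == "llama3":
        if default_system_message:
            register_llama3_template(default_system_message)
        else:
            register_llama3_template()

print("old dispatch, chat_template unset:")
old_dispatch(None)        # registers chatml and llama3 anyway
print("new dispatch, chat_template unset:")
new_dispatch(None)        # registers nothing
print("new dispatch, chat_template='llama3':")
new_dispatch("llama3")    # registers llama3, matching the config fix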
Mine looks like this:
if parsed_cfg.chat_template == "chatml":
if parsed_cfg.default_system_message:
LOG.info(
f"ChatML set. Adding default system message: {parsed_cfg.default_system_message}"
)
register_chatml_template(parsed_cfg.default_system_message)
else:
register_chatml_template()
elif parsed_cfg.chat_template == "llama3":
if parsed_cfg.default_system_message:
LOG.info(
f"LLaMA-3 set. Adding default system message: {parsed_cfg.default_system_message}"
)
register_llama3_template(parsed_cfg.default_system_message)
else:
register_llama3_template()
They are basically the same.
Nvm, I changed it to yours and it worked.
Please check that this issue hasn't been reported before.
Expected Behavior
python -m axolotl.cli.preprocess test.yaml --debug
should succeed, as shown below (different dataset, executed before #1553 was merged):
Current behaviour
Fails with the error messages below:
Interestingly, the training command below runs fine without any errors.
accelerate launch -m axolotl.cli.train configs/test.yaml
Steps to reproduce
I don't think the data itself matters, but I've attached example data below (part of val.jsonl):
Run:
python -m axolotl.cli.preprocess test.yaml --debug
Config yaml
Possible solution
No response
Which Operating Systems are you using?
Python Version
3.10.12
axolotl branch-commit
main/2147cf68 Llama3 dpo (#1610)
Acknowledgements