Closed — l-k-11235 closed this issue 3 months ago
Good catch. Forgot to fix this one post checkpoint format restructuring. Can you PR a patch similar to this? https://github.com/eole-nlp/eole/blob/60fbbe4ee06d1be93173e8f4c6f06b97266ec46b/eole/bin/convert/convert_HF.py#L885-L889
NB we shall add some tests for such features/tools at some point.
Sorry, it does not seem to work; I got the same error with the patch.
Please provide a trace and a diff. It's probably not the exact same error, since the object is supposed to have changed type.
The `recursive_model_fields_set` function takes a pydantic config object and returns a nested dict containing all the explicitly set fields from that config, so it is supposed to be JSON serializable.
What you might have is a similar error, but with another object.
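For context, the behavior described above can be sketched as follows. This is an illustrative sketch, not the actual eole implementation: the config classes (`Training`, `Config`) and the field values are made up here, and it assumes pydantic v2, where explicitly set fields are tracked in `model_fields_set`.

```python
import json

from pydantic import BaseModel


# Hypothetical config classes for illustration only.
class Training(BaseModel):
    lr: float = 1e-3
    steps: int = 1000


class Config(BaseModel):
    name: str = "model"
    training: Training = Training()


def recursive_model_fields_set(model: BaseModel) -> dict:
    """Recursively collect only the explicitly set fields of a pydantic
    config into a plain, JSON-serializable nested dict (sketch)."""
    out = {}
    for field in model.model_fields_set:  # pydantic v2 API
        value = getattr(model, field)
        if isinstance(value, BaseModel):
            out[field] = recursive_model_fields_set(value)
        else:
            out[field] = value
    return out


# Only fields passed explicitly appear in the result; defaults are skipped.
cfg = Config(name="llama3-8b", training=Training(lr=5e-5))
print(json.dumps(recursive_model_fields_set(cfg)))
```

Since the result is a plain nested dict of primitive values, `json.dumps` succeeds; the serialization error in this issue would come from some other, non-serializable object ending up in the tree.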
Just tested the suggested patch (#28) and it seems to work fine. Please re-open with more details if other issues arise.
I got that error when I ran the command:

```
eole model lora --action merge --base_model ${EOLE_MODEL_DIR}/llama3-8b --lora_weights ./finetune/llama3-8b-finetune/step_10500 --output ./finetune/merged
```
(I have just commented out the related lines for the moment.)