Closed (hschaeufler closed this issue 1 month ago)
- Yes, `lora_layers` is ignored. It should be removed from the example config.
- We could have a check.. but then again.. it might be useful to let the user include other stuff in the config (some metadata for their own experiments). A sketch of such a check is below.
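For illustration, a minimal sketch of that kind of check, assuming a hypothetical `KNOWN_KEYS` set (not mlx_lm's actual parameter list): unrecognized keys only produce a notice, so extra user metadata still passes through.

```python
# Hypothetical sketch: flag stale tuning keys without rejecting metadata.
# KNOWN_KEYS is an assumption for illustration, not mlx_lm's real list.
KNOWN_KEYS = {"model", "train", "num_layers", "batch_size", "learning_rate"}

def check_keys(config: dict) -> None:
    for key in config:
        if key not in KNOWN_KEYS:
            print(f"Note: '{key}' is not a recognized training parameter "
                  "and will be ignored by the trainer.")
```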
Ok. Thank you. Should I create a PR for the example_config.yaml or do you want to do that?
That's a good point. I didn't realize it was intended to include metadata. I always created an extra json file with additional metadata.
I just realized by coincidence that I still had the old `lora_layers` parameter in the config. Maybe a deprecation warning would have helped me recognize earlier that the parameters had also changed in the yaml. But that's not meant as criticism. I am grateful for your work on MLX LM and think you are doing a great job. :)
Thanks for the fix! For the future, a deprecation warning is the right call.. we'll be more careful there.
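As a rough sketch of what such a warning could look like (the `RENAMED` mapping below is an assumption for illustration, not mlx_lm's actual code):

```python
import warnings

# Sketch only: warn when a renamed parameter is still present in a config.
RENAMED = {"lora_layers": "num_layers"}  # old name -> new name (assumed)

def warn_deprecated(config: dict) -> None:
    for old, new in RENAMED.items():
        if old in config:
            warnings.warn(
                f"'{old}' was renamed to '{new}'; the old key is ignored.",
                DeprecationWarning,
                stacklevel=2,
            )
```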
When updating the config yaml for the addition of full parameter tuning, the `lora_layers` parameter was not adjusted to `num_layers`:
https://github.com/ml-explore/mlx-examples/blob/9000e280aeb56c2bcce128001ab157030095687a/llms/mlx_lm/examples/lora_config.yaml#L17
However, if the config is used, both values later end up in adapter_config.json. I assume that only the `num_layers` value is valid and `lora_layers` is just copied over? I probably have to fine-tune my model again if I want to use the correct number of layers?
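One quick way to see which value was actually recorded for a finished run, as a sketch (the adapter path here is an assumption):

```python
import json

# Sketch: inspect an existing adapter_config.json to compare the stale
# key against the one the trainer actually uses. Path is illustrative.
with open("adapters/adapter_config.json") as f:
    adapter_config = json.load(f)

print("num_layers (used):  ", adapter_config.get("num_layers"))
print("lora_layers (stale):", adapter_config.get("lora_layers"))
```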
Maybe it also makes sense to include a check that only valid values are included, in addition to updating the `lora_config.yaml`?
Are there any other values that I should update in the config? I have seen that `use_dora` has also been dropped?