jina-ai / jerboa

LLM finetuning
Apache License 2.0

feat: add more layers to lora finetuning #43

Closed alaeddine-13 closed 1 year ago

samsja commented 1 year ago

Should this be a default argument? IMO we should just change it at runtime with the CLI.

Or maybe we add a "full" parameter that adds all of the layers.
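
A minimal sketch of the second idea, assuming a PEFT-style `LoraConfig`: the flag name `--lora-target-modules`, the `"full"` shorthand, and the module name lists are hypothetical and not the actual jerboa CLI.

```python
# Hypothetical sketch: pick LoRA target layers at runtime instead of
# changing the default, with "full" expanding to all linear projections.
import argparse

from peft import LoraConfig

# Commonly targeted attention projections for LLaMA-style models (assumption).
DEFAULT_TARGETS = ["q_proj", "v_proj"]
# "full" could mean every linear layer in the transformer block (assumption).
FULL_TARGETS = ["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"]

parser = argparse.ArgumentParser()
parser.add_argument(
    "--lora-target-modules",
    default=",".join(DEFAULT_TARGETS),
    help='Comma-separated module names, or "full" to adapt all linear layers.',
)
args = parser.parse_args()

targets = (
    FULL_TARGETS
    if args.lora_target_modules == "full"
    else args.lora_target_modules.split(",")
)

# Build the LoRA config with the chosen target modules.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=targets,
)
print(lora_config.target_modules)
```

e.g. `python finetune.py --lora-target-modules full` would adapt all projections, while the default keeps the smaller q/v set.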