rmihaylov / falcontune

Tune any FALCON in 4-bit
Apache License 2.0

It's working very well, but there are 2 issues: #32

Open TeaCult opened 1 year ago

TeaCult commented 1 year ago

1) pyarrow throws an error if the dataset JSON file is larger than 10 MB.

2) I really can't figure out what all the parameters are, or how to resume when training is interrupted. Is there any reference for the parameters and switches, such as:

```
--data_type=alpaca \
--lora_out_dir=./falcon-7b-instruct-4bit-alpaca/ \
--mbatch_size=1 \
--batch_size=2 \
--epochs=3 \
--lr=3e-4 \
--cutoff_len=256 \
--lora_r=8 \
--lora_alpha=16 \
--lora_dropout=0.05 \
--warmup_steps=5 \
--save_steps=50 \
--save_total_limit=3 \
--logging_steps=5 \
--target_modules='["query_key_value"]' \
...
```
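For what it's worth, these switches follow the usual LoRA fine-tuning conventions, so their likely meanings can be sketched with a stdlib `argparse` parser. This is an assumption based on standard LoRA/Trainer semantics, not taken from falcontune's source; the help strings are my reading of each flag:

```python
import argparse

# Hypothetical parser mirroring the switches above; meanings are the
# conventional LoRA / fine-tuning ones, not confirmed against falcontune.
parser = argparse.ArgumentParser(description="LoRA fine-tuning switches (sketch)")
parser.add_argument("--data_type", default="alpaca",
                    help="prompt/dataset format used to build training samples")
parser.add_argument("--lora_out_dir", default="./lora-out/",
                    help="where LoRA adapter weights and checkpoints are written")
parser.add_argument("--mbatch_size", type=int, default=1,
                    help="per-device micro-batch size")
parser.add_argument("--batch_size", type=int, default=2,
                    help="effective batch size; gradient accumulation steps are "
                         "typically batch_size // mbatch_size")
parser.add_argument("--epochs", type=int, default=3,
                    help="number of passes over the dataset")
parser.add_argument("--lr", type=float, default=3e-4,
                    help="peak learning rate")
parser.add_argument("--cutoff_len", type=int, default=256,
                    help="max tokenized sequence length; longer samples truncated")
parser.add_argument("--lora_r", type=int, default=8,
                    help="LoRA rank (size of the low-rank update matrices)")
parser.add_argument("--lora_alpha", type=int, default=16,
                    help="LoRA scaling factor (update scaled by alpha / r)")
parser.add_argument("--lora_dropout", type=float, default=0.05,
                    help="dropout applied inside the LoRA layers")
parser.add_argument("--warmup_steps", type=int, default=5,
                    help="steps of learning-rate warmup")
parser.add_argument("--save_steps", type=int, default=50,
                    help="write a checkpoint every N optimizer steps")
parser.add_argument("--save_total_limit", type=int, default=3,
                    help="keep only the N most recent checkpoints")
parser.add_argument("--logging_steps", type=int, default=5,
                    help="log metrics every N steps")
parser.add_argument("--target_modules", default='["query_key_value"]',
                    help="which linear layers receive LoRA adapters")

args = parser.parse_args(["--lr", "3e-4", "--lora_r", "8"])
```

If these map onto Hugging Face `TrainingArguments` internally (again, an assumption), resuming an interrupted run would normally mean pointing the trainer at the latest `checkpoint-*` directory under the output dir via `resume_from_checkpoint`.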

Thank you very much.
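A possible workaround for the first issue (pyarrow failing on JSON files over ~10 MB) is to split the dataset into smaller shard files before loading. This is a minimal stdlib sketch, assuming the input is a single JSON array of training records; the function name, shard naming, and default shard size are all illustrative:

```python
import json
from pathlib import Path


def shard_json_dataset(path, max_records=1000, out_dir="shards"):
    """Split one large list-of-records JSON file into smaller shard files.

    Hypothetical helper: assumes the file holds a single JSON array of
    examples (e.g. Alpaca-style dicts). Returns the paths of the shards.
    """
    records = json.loads(Path(path).read_text(encoding="utf-8"))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    shard_paths = []
    for i in range(0, len(records), max_records):
        shard = out / f"{Path(path).stem}_{i // max_records:04d}.json"
        shard.write_text(json.dumps(records[i:i + max_records]),
                         encoding="utf-8")
        shard_paths.append(shard)
    return shard_paths
```

Each shard stays under the problematic size, so they can be fed to the tuning script one at a time (or the shard size lowered until pyarrow stops complaining).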