dilerbatu opened this issue 3 months ago
Hey Everyone,
I was trying to finetune gemma-2-2b-it on my local PC, which has an A3000 GPU. I followed the conda install method.
This is my trainer:
```python
trainer = SFTTrainer(
    model = model,
    tokenizer = tokenizer,
    train_dataset = dataset,
    dataset_text_field = "text",
    max_seq_length = max_seq_length,
    dataset_num_proc = 2,
    packing = False,  # Can make training 5x faster for short sequences.
    args = TrainingArguments(
        per_device_train_batch_size = 2,
        gradient_accumulation_steps = 4,
        warmup_steps = 5,
        num_train_epochs = 10,
        learning_rate = 2e-4,
        fp16 = not is_bfloat16_supported(),
        bf16 = is_bfloat16_supported(),
        logging_steps = 1,
        optim = "adamw_8bit",
        weight_decay = 0.01,
        lr_scheduler_type = "linear",
        seed = 3407,
        output_dir = "outputs",
    ),
)
```
When I start the training process, I get this error: `TypeError: is_bf16_supported() got an unexpected keyword argument 'including_emulation'`.
Has anyone seen this error before? Thanks.
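For context, the traceback means something in the stack is passing an `including_emulation` keyword to `torch.cuda.is_bf16_supported()`, which the installed PyTorch build does not accept. A minimal sketch for checking which signature your local torch exposes (an illustration only, not Unsloth's actual code, and assuming a CUDA build of PyTorch):

```python
import inspect
import torch

# Newer torch releases accept `including_emulation`; older ones take no arguments.
params = inspect.signature(torch.cuda.is_bf16_supported).parameters
if "including_emulation" in params:
    print("bf16 supported:", torch.cuda.is_bf16_supported(including_emulation=False))
else:
    print("bf16 supported:", torch.cuda.is_bf16_supported())
```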
Just fixed, apologies - please reinstall Unsloth via:

```
pip uninstall unsloth -y
pip install --upgrade --no-cache-dir "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
```
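After reinstalling, a quick sanity check (a hypothetical snippet, not part of the fix itself) is to confirm which versions actually got picked up in the environment:

```python
from importlib.metadata import version
import torch

print("torch  :", torch.__version__)
print("unsloth:", version("unsloth"))
```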