Closed lopezfelipe closed 6 days ago
Hello -- The sample for importing a custom model into Bedrock has a bug when loading the PEFT model.
The current script will finish training but will result in a model that Bedrock cannot load.
```python
# load PEFT model
model = AutoPeftModelForCausalLM.from_pretrained(
    training_args.output_dir,
    low_cpu_mem_usage=True,
    torch_dtype=torch_dtype,
)
```
The issue appears to be fixed when changing the `torch_dtype` definition to `torch.float16`.
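For reference, a minimal sketch of the corrected load call (assuming the sample's `training_args` is replaced here by a hypothetical `output_dir` path, and that `AutoPeftModelForCausalLM` comes from the `peft` library as in the sample; this cannot run without the trained adapter on disk):

```python
import torch
from peft import AutoPeftModelForCausalLM

# Hypothetical path; the sample script uses training_args.output_dir.
output_dir = "path/to/training/output"

# Load the fine-tuned PEFT adapter with an explicit fp16 dtype instead of
# the script's torch_dtype variable, which may resolve to a dtype (e.g.
# bfloat16) that the Bedrock custom-model import cannot load.
model = AutoPeftModelForCausalLM.from_pretrained(
    output_dir,
    low_cpu_mem_usage=True,
    torch_dtype=torch.float16,
)
```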