Lightning-AI / litgpt

Pretrain, finetune, deploy 20+ LLMs on your own data. Uses state-of-the-art techniques: flash attention, FSDP, 4-bit, LoRA, and more.
https://lightning.ai
Apache License 2.0

Conversion to HF checkpoint should generate a checkpoint format that can be loaded directly #1359

Open awaelchli opened 3 weeks ago

awaelchli commented 3 weeks ago

The conversion we have with litgpt convert to a Hugging Face checkpoint creates a model.pth file. But then, as described in the tutorial, you have to load it like so:

import torch
from transformers import AutoModel

# Load the converted weights and pass them in explicitly,
# since from_pretrained cannot read model.pth on its own.
state_dict = torch.load("output_dir/model.pth")
model = AutoModel.from_pretrained(
    "output_dir/", local_files_only=True, state_dict=state_dict
)

Instead, we should make it work like this:

model = AutoModel.from_pretrained("output_dir")

The only blocker is that from_pretrained requires pytorch_model.bin to be loadable with weights_only=True. Our checkpoints don't satisfy this constraint because we save them using the incremental pickle save. See #1357 for more context, where we had to work around this.
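
In the meantime, one possible workaround might be to re-serialize the converted state dict with a plain torch.save, so the resulting pickle contains only tensors and passes the weights_only=True restriction. A minimal sketch (assuming a PyTorch version with the weights_only flag, and the same output_dir as in the example above, which already contains the config.json written by the conversion):

import torch
from transformers import AutoModel

# Load the litgpt checkpoint; the incremental pickle format needs the
# full unpickler, hence weights_only=False here.
state_dict = torch.load("output_dir/model.pth", weights_only=False)

# Re-save with a plain torch.save. The new pickle holds only tensors,
# so it satisfies the weights_only=True loading path.
torch.save(state_dict, "output_dir/pytorch_model.bin")

# The desired one-liner now works against the re-saved file.
model = AutoModel.from_pretrained("output_dir", local_files_only=True)

Ideally the conversion itself would write a file like this (or safetensors) directly, so no extra step is needed.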

rasbt commented 3 weeks ago

Yes, I agree. There were a few people on Discord recently struggling with this.