karpathy opened 5 months ago
Most inference frameworks, including vLLM and llama.cpp, support the safetensors format.
In theory, we can write a utility Python script that loads an llm.c checkpoint into a PyTorch state_dict and saves it as safetensors:

```python
# load the llm.c checkpoint weights into a PyTorch state_dict, then:
from safetensors.torch import save_file
save_file(state_dict, "model.safetensors")
```
This would also let us use libraries such as lighteval to run broader evaluations across more benchmarks.
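The first step of such a script would be reading the llm.c binary checkpoint. A minimal sketch of parsing the header, assuming the layout that `train_gpt2.c` writes (a 256-int32 header with a `20240326` magic number followed by fp32 parameters); the exact field order is an assumption and may change between llm.c versions:

```python
import struct

def read_llmc_header(path):
    # llm.c GPT-2 checkpoints start with a 256-int32 little-endian header.
    # Field order below (magic, version, then model dims) is assumed from
    # train_gpt2.c and may change between versions.
    with open(path, "rb") as f:
        header = struct.unpack("<256i", f.read(256 * 4))
    assert header[0] == 20240326, "not an llm.c model file (bad magic)"
    return {
        "version": header[1],
        "max_seq_len": header[2],
        "vocab_size": header[3],
        "num_layers": header[4],
        "num_heads": header[5],
        "channels": header[6],
    }
```

The fp32 parameter tensors follow the header in a fixed order, so once the dimensions are known, the rest of the file can be sliced into named tensors for the state_dict.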
@YuchenJin yep, exactly what I had in mind! I put up the issue because I am sequencing other things before I get around to it; possibly someone can pick it up in parallel before then.
Cool, I will give it a shot if no one starts working on it by mid-next week. :)
I'd be very interested in how we could take llm.c models and export them into universal formats, e.g. for very fast inference in llama.cpp, vLLM, etc., or how they could be made HuggingFace compatible. This would also allow us to run more comprehensive evals on the models we train in llm.c, because it would (hopefully) slot into the existing infrastructure of those projects.
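For HuggingFace compatibility, the exported tensors would need the key names that transformers' `GPT2LMHeadModel` expects. A sketch of generating that key layout (names assumed from the HF GPT-2 implementation, where `lm_head.weight` is tied to `transformer.wte.weight`):

```python
def hf_gpt2_keys(num_layers):
    # State-dict key names assumed from transformers' GPT2LMHeadModel;
    # llm.c's flat parameter tensors would be renamed to these.
    keys = ["transformer.wte.weight", "transformer.wpe.weight"]
    for i in range(num_layers):
        prefix = f"transformer.h.{i}."
        for name in ("ln_1", "attn.c_attn", "attn.c_proj",
                     "ln_2", "mlp.c_fc", "mlp.c_proj"):
            keys += [prefix + name + ".weight", prefix + name + ".bias"]
    # final layernorm; lm_head.weight is tied to wte, so it is omitted here
    keys += ["transformer.ln_f.weight", "transformer.ln_f.bias"]
    return keys
```

With this mapping plus a config.json, the safetensors file could (hopefully) be loaded directly by `GPT2LMHeadModel.from_pretrained` and by the eval tooling built on top of it.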