Open tairov opened 8 months ago
Hi. Thanks for this port. I was trying to run inference on babyllama, but it seems this port no longer supports it, i.e. the original `llama2.c`-compatible `stories*.bin` models. How can I do this?
Currently it doesn't support that model format; I'll add a way to run it.