codesoap opened 1 year ago
The 13B model is now fully trained as well:
Hi there,
A viable alternative to this project:
Good day! :)
@limcheekin I was not aware of those models. Thanks! I'm checking them out right now.
Though I must say that I really liked the `### Input:`
option of OpenAlpaca. The open-instruct models don't seem to be trained for this.
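For context, the `### Input:` field comes from the Alpaca-style prompt template, where an optional input block provides extra context alongside the instruction. A minimal sketch of that template (the wording below is the common Alpaca phrasing and may differ slightly from OpenAlpaca's exact strings):

```python
# Sketch of an Alpaca-style prompt builder with the optional "### Input:" field.
# The template text is the widely used Alpaca wording, assumed here for
# illustration; check the model card for the exact strings it was trained on.
def build_prompt(instruction: str, input_text: str = "") -> str:
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Without an input, the shorter instruction-only template is used.
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("Summarize the text.",
                   "OpenLLaMA is an open reproduction of LLaMA."))
```

Models not trained with the input block (like the open-instruct ones mentioned above) typically expect only the instruction-only variant.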
Hey guys! Great work! How much VRAM do we need for inference?
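As a rough back-of-envelope (not an official figure): VRAM for inference is roughly parameter count times bytes per weight, plus overhead for activations and the KV cache. A sketch, assuming fp16 weights and a hypothetical 20% overhead factor:

```python
# Rough VRAM estimate: params * bytes_per_param, inflated by an assumed
# overhead factor for activations and KV cache. The 1.2 factor is a
# hypothetical fudge factor, not a measured value.
def estimate_vram_gb(n_params_billion: float,
                     bytes_per_param: int = 2,   # 2 = fp16/bf16
                     overhead: float = 1.2) -> float:
    return n_params_billion * 1e9 * bytes_per_param * overhead / 1e9

for size in (3, 7, 13):
    print(f"{size}B fp16: ~{estimate_vram_gb(size):.1f} GB")
```

Int8 quantization roughly halves these numbers (`bytes_per_param=1`), and 4-bit quantization quarters them.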
JFYI: Pretraining for the 3B and 7B models is complete:
PS: Training for a 13B model has also begun: https://huggingface.co/openlm-research/open_llama_13b_600bt