Lightning-AI / lit-llama

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.

`open in google colab` integration #305

Andrei-Aksionov opened this issue 1 year ago

Andrei-Aksionov commented 1 year ago

Hi there 👋

As a way to make this repo more accessible, how about adding notebooks with an Open in Colab button?

For now, there is a howto folder with markdown files describing which steps are required and what outputs are expected. A notebook with such a button at the top would be much more convenient, in my opinion. Plus, it would help in cases where the code doesn't work on someone's machine - the notebooks would serve as a reference point.


As a reminder of how this can be accomplished, here is another shameless plug of mine: a link to such a notebook in my repo.
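For reference, the button is typically just a markdown badge linked through Colab's GitHub loader, placed in the notebook's first cell or in the README. A minimal sketch below; the notebook path `notebooks/inference.ipynb` is a hypothetical example, not an actual file in this repo:

```markdown
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Lightning-AI/lit-llama/blob/main/notebooks/inference.ipynb)
```

The link follows Colab's `colab.research.google.com/github/<org>/<repo>/blob/<branch>/<path>` pattern, so clicking the badge opens that notebook directly from GitHub in a fresh Colab runtime.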

maadnfritz commented 1 year ago

agree

Andrei-Aksionov commented 1 year ago

Now that QLoRA has been released, which even provides a notebook showing how to run inference on Google Colab's free tier, I think it's about time to implement something similar here too.

MartinKondor commented 1 year ago

I also think it would be amazing to have some notebooks.