Powerkrieger / NobbyGPT

Reimplementation of nanoGPT for educational purposes

Pick and set up English LLM #2

Closed Kuckuck44 closed 9 months ago

Kuckuck44 commented 10 months ago
  1. Pick a model (ideally an informed decision)
  2. Test whether the model can be fine-tuned (proof of concept with a test implementation and a test dataset)
  3. Get the implementation ready to accept real data, e.g. from weeve
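For step 3, a minimal sketch of what "accepting real data" could look like: converting records into the JSONL prompt/completion layout that most fine-tuning scripts consume. The field names and the example record are assumptions, not the actual weeve format.

```python
import json

def to_jsonl(records, path):
    """Serialize (prompt, completion) pairs into JSONL, one JSON
    object per line. Field names are an assumption; adapt them to
    whatever the chosen fine-tuning script expects."""
    with open(path, "w", encoding="utf-8") as f:
        for prompt, completion in records:
            f.write(json.dumps({"prompt": prompt, "completion": completion}) + "\n")

# Made-up example record; real pairs would be derived from weeve data.
records = [
    ("Summarize the sensor log:", "Temperature stable, one pressure spike at 14:02."),
]
to_jsonl(records, "finetune_data.jsonl")
```

Keeping the conversion in one small function makes it easy to swap the record source later without touching the trainer side.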
Powerkrieger commented 10 months ago

A blog post about fine-tuning a quantized Llama 2 on a local mid-tier GPU:

https://medium.com/innova-technology/efficient-fine-tuning-of-quantized-llama-2-da383228ee1e
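The approach in such posts is QLoRA-style fine-tuning: load the base model in 4-bit and train small low-rank adapters on top. A configuration sketch using the `transformers` and `peft` libraries could look like the following; the model name and all hyperparameters are illustrative choices, not values taken from the post, and the gated Llama 2 weights require Hugging Face access approval.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization so a 7B model fits on a mid-tier GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # gated repo; illustrative choice
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# Train low-rank adapters instead of the full weight matrices.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

This is a configuration fragment, not a runnable PoC: it needs a CUDA GPU, `bitsandbytes`, and approved access to the Llama 2 weights before anything executes.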

Powerkrieger commented 10 months ago

General Llama 2 resources:

https://github.com/facebookresearch/llama-recipes/

Powerkrieger commented 10 months ago

A notebook about fine-tuning Llama 2 on your own data. It also mentions brev.dev, a company that rents out GPUs in a convenient fashion.

https://colab.research.google.com/github/brevdev/notebooks/blob/main/llama2-finetune-own-data.ipynb
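When fine-tuning the chat variants on your own data, one detail worth getting right is Llama 2's chat prompt template (the `[INST]`/`<<SYS>>` markers from the published chat format). A small stdlib sketch; the example texts are made up:

```python
def llama2_prompt(system: str, user: str) -> str:
    """Wrap a system and a user message in the Llama 2 chat
    template used by the -chat model variants."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

# Made-up example; real pairs would come from the fine-tuning set.
p = llama2_prompt("You are a helpful assistant.", "What is nanoGPT?")
```

Training data formatted with a different template than the one the model saw in pretraining tends to degrade results, so it pays to apply this consistently across the dataset.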

Powerkrieger commented 9 months ago

Code is running on a GCP instance, issue solved.