geronimi73 / 3090_shorts

Minimal LLM scripts for 24 GB VRAM GPUs: training, inference, whatever.
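
For orientation, a minimal sketch of the kind of script meant here: loading a ~7B model in 4-bit with Hugging Face transformers and bitsandbytes so inference fits comfortably in 24 GB on a single RTX 3090. The model id and generation settings are illustrative assumptions, not taken from the repo.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Illustrative model choice; any ~7B causal LM fits in 24 GB at 4-bit.
model_id = "mistralai/Mistral-7B-v0.1"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize weights to 4-bit to stay well under 24 GB
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 for speed and stability
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                      # place all layers on the single GPU
)

inputs = tokenizer(
    "The 3090 has 24 GB of VRAM, which is enough to",
    return_tensors="pt",
).to(model.device)

output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```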