turboderp / exllama

A more memory-efficient rewrite of the HF transformers implementation of Llama for use with quantized weights.
MIT License

recover unsaved modification #250

Closed Kerushii closed 10 months ago

Kerushii commented 10 months ago

Recovers unsaved changes and data lost due to mod abuse by not_a_spy_for_openai

vadi2 commented 10 months ago

What happened with this?

turboderp commented 10 months ago

Some Discord drama, I think. I'm going to modify the example in a little bit, and maybe incorporate some of the changes into generator.py instead. To be clear, there was no data loss on the GitHub repo.

vadi2 commented 10 months ago

I'd love to see generator.py take on more of the examples' functionality - I struggled with this too: the amount of scaffolding needed in practice just to use it.