dvmazur / mixtral-offloading

Run Mixtral-8x7B models in Colab or consumer desktops
MIT License

Implementation of benchmarks (C4 perplexity, Wikitext perplexity) #33

Open ChengSashankh opened 7 months ago

ChengSashankh commented 7 months ago

Hey,

Great repo! I'm trying to reproduce some of the benchmarks from your technical report, but I'm having trouble evaluating the model. Would you be able to share the code you used to compute the model's perplexity on C4, etc.?

Thanks!
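For reference, by "perplexity" I mean the usual exp of the mean per-token negative log-likelihood. A minimal sketch of that final step, assuming the per-token log-probabilities have already been extracted from the model's logits (how you batch/stride over C4 or Wikitext is exactly what I'm hoping to match):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp(mean negative log-likelihood) over all scored tokens.

    token_logprobs: iterable of natural-log probabilities, one per token,
    e.g. gathered from log_softmax(logits) at the target token ids.
    """
    logprobs = list(token_logprobs)
    nll = -sum(logprobs) / len(logprobs)  # mean negative log-likelihood
    return math.exp(nll)

# Sanity check: if every token has probability 0.5, perplexity is exactly 2.
print(perplexity([math.log(0.5)] * 10))  # → 2.0
```

My understanding is that the standard setup (e.g. the one described in the Hugging Face fixed-length-model perplexity guide) concatenates the tokenized corpus and scores it in fixed-length windows with a stride, only counting the non-overlapping tokens of each window, but the exact window size, stride, and C4 subset/split all change the number, which is why I'd like to see the code you used.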