microsoft/unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI

The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits #1650

Status: Open · opened by hsb1995 3 weeks ago

hsb1995 commented 3 weeks ago

Could the code for compressing large models (the 1.58-bit quantization) be open-sourced? Clicking the link in the repository currently leads to an empty page.
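
While waiting for the official release, here is a minimal sketch of the absmean weight quantization that the BitNet b1.58 paper describes: scale the weight matrix by its mean absolute value, then round every entry to the nearest value in {-1, 0, +1}. The function name and the `eps` floor are my own illustrative choices, not from this repository.

```python
import torch

def absmean_ternary_quant(w: torch.Tensor, eps: float = 1e-5):
    """Quantize a weight tensor to 1.58 bits per the BitNet b1.58 recipe:
    divide by the mean absolute value, then round-and-clip to {-1, 0, +1}."""
    scale = w.abs().mean().clamp(min=eps)    # per-tensor absmean scale
    w_q = (w / scale).round().clamp_(-1, 1)  # RoundClip onto {-1, 0, +1}
    return w_q, scale                        # dequantize with w_q * scale

# Example: quantize one linear layer's weights and inspect the result
layer = torch.nn.Linear(512, 512)
w_q, scale = absmean_ternary_quant(layer.weight.data)
print(torch.unique(w_q))                               # tensor([-1., 0., 1.])
print((layer.weight.data - w_q * scale).abs().mean())  # reconstruction error
```

This only covers the weight side; the paper pairs it with 8-bit absmax activation quantization, which an official code release would presumably include as well.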