SkyworkAI / Skywork

Skywork series models are pre-trained on 3.2TB of high-quality multilingual (mainly Chinese and English) and code data. We have open-sourced the model weights, training data, evaluation data, and evaluation methods.

Is the Skywork team interested in releasing a distilled 7B version to support speculative sampling and inference on low-resource devices? #49

Open tq-xyy opened 9 months ago

tq-xyy commented 9 months ago

As the title says. As far as I know, the CausalLM/7B project has already done this, using a knowledge distillation approach.
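For context on what such a distilled 7B model would be used for: in speculative sampling, a small "draft" model cheaply proposes several tokens, and the large "target" model verifies them in a single pass, accepting the longest agreeing prefix. Below is a minimal greedy-decoding sketch of that accept/reject loop; the two toy models (`draft_next`, `target_next`) are stand-ins invented for illustration, not the Skywork API.

```python
def draft_next(tokens):
    # Toy draft model: predicts (last token + 1) mod 10, but deliberately
    # disagrees with the target on every 4th position to force rejections.
    nxt = (tokens[-1] + 1) % 10
    if len(tokens) % 4 == 0:
        nxt = (nxt + 5) % 10  # deliberate disagreement
    return nxt

def target_next(tokens):
    # Toy target model: always predicts (last token + 1) mod 10.
    return (tokens[-1] + 1) % 10

def speculative_decode(prompt, n_new, k=4):
    """Generate n_new tokens, proposing k draft tokens per verification step."""
    tokens = list(prompt)
    while len(tokens) < len(prompt) + n_new:
        # 1) Draft model proposes k tokens autoregressively (cheap).
        proposal = list(tokens)
        for _ in range(k):
            proposal.append(draft_next(proposal))
        # 2) Target model checks each proposed position (done in one
        #    parallel forward pass in a real implementation).
        accepted = 0
        for i in range(k):
            prefix = proposal[:len(tokens) + i]
            if target_next(prefix) == proposal[len(tokens) + i]:
                accepted += 1
            else:
                # On the first mismatch, substitute the target's own token,
                # so every verification step still emits at least one token.
                proposal[len(tokens) + i] = target_next(prefix)
                accepted += 1
                break
        tokens = proposal[:len(tokens) + accepted]
    return tokens[:len(prompt) + n_new]
```

In this toy setup the draft agrees with the target on three of every four tokens, so each verification step advances the sequence by up to four tokens at the cost of one target-model pass, which is where the speedup comes from. The same loop is why a small distilled model (well aligned with the big one) makes a good draft model.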

TianwenWei commented 9 months ago

There are already plenty of open-source 7B models, so we are considering open-sourcing a 3B version instead.