Closed: gulaodeng closed this issue 10 months ago
Related links: https://github.com/SakuraLLM/Sakura-13B-Galgame https://github.com/HIllya51/LunaTranslator/blob/main/LunaTranslator/LunaTranslator/translator/sakura.py
It may be slightly worse, but it can run offline.
My old laptop can't even run 翻译姬 😢
The Sakura model backend can be deployed as an API on the free GPUs of Colab/Kaggle, combined with a tunneling service so it is reachable from outside; the AutoGPTQ-quantized models can basically all run there.
Example notebooks for deploying the backend API on Colab/Kaggle: SakuraLLM-Notebooks (a minimal tunneling sketch follows below).
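As a rough illustration of the tunneling step, a notebook cell like the following could expose the backend's local port with pyngrok. The port number, the choice of ngrok, and the auth token placeholder are assumptions for illustration only; the SakuraLLM-Notebooks linked above show the actual setup.

```python
# Minimal sketch: expose a Sakura backend running inside a Colab/Kaggle
# notebook so that an external client (e.g. GalTransl) can reach its API.
from pyngrok import ngrok

ngrok.set_auth_token("YOUR_NGROK_AUTHTOKEN")   # hypothetical placeholder token
tunnel = ngrok.connect(8080, "http")           # assume the backend listens on port 8080
print("Public API base URL:", tunnel.public_url)
```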
The Sakura model's API is quite similar to OpenAI's, so perhaps the existing GPT-3/4 code can be reused directly?
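If the backend does expose an OpenAI-compatible interface, reusing the GPT-3/4 request code might look roughly like this sketch. The endpoint path, model name, and prompt wording here are assumptions, not confirmed details; the sakura.py file linked above shows the request format LunaTranslator actually uses.

```python
import requests

# Hypothetical base URL produced by the Colab/Kaggle tunnel above.
API_BASE = "https://your-tunnel-url.example.com/v1"

def sakura_translate(japanese_text: str) -> str:
    """Translate one Japanese string via an OpenAI-style chat completions call."""
    payload = {
        "model": "sakura-13b",  # many single-model backends ignore this field
        "messages": [
            {"role": "system",
             "content": "你是一个轻小说翻译模型，可以流畅通顺地将日文翻译成简体中文。"},
            {"role": "user",
             "content": f"将下面的日文文本翻译成中文：{japanese_text}"},
        ],
        "temperature": 0.1,
        "max_tokens": 512,
    }
    resp = requests.post(f"{API_BASE}/chat/completions", json=payload, timeout=60)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(sakura_translate("こんにちは、世界。"))
```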
https://github.com/XD2333/GalTransl/releases/tag/3.1.0