ymcui / Chinese-LLaMA-Alpaca

Chinese LLaMA & Alpaca large language models + local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
https://github.com/ymcui/Chinese-LLaMA-Alpaca/wiki
Apache License 2.0

Does resize_token_embeddings tie the input and output embeddings? #822

Closed NonvolatileMemory closed 1 year ago

NonvolatileMemory commented 1 year ago

Required checklist before submission

Issue type

Model training and fine-tuning

Base model

LLaMA-7B

Operating system

None

Detailed problem description

# Paste the code you ran here (delete this code block if there is none)

I see that you use model.resize_token_embeddings to add the Chinese tokens, but according to the Hugging Face source code, this method appears to tie the input and output embeddings directly.

However, LLaMA does not seem to share these two embeddings. Is my understanding mistaken?
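For context, the tying behavior hinges on config.tie_word_embeddings. A simplified sketch of the relevant transformers logic (paraphrased, not the verbatim source): resize_token_embeddings ends by calling tie_weights(), but tie_weights() only shares the two matrices when that config flag is set.

```python
# Simplified sketch of the transformers behavior in question (not verbatim source).
# resize_token_embeddings() calls tie_weights() at the end, but tie_weights()
# only shares the input/output matrices when config.tie_word_embeddings is True.
class PreTrainedModelSketch:
    def tie_weights(self):
        if getattr(self.config, "tie_word_embeddings", True):
            output_embeddings = self.get_output_embeddings()
            if output_embeddings is not None:
                # Point lm_head.weight at embed_tokens.weight (shared storage).
                output_embeddings.weight = self.get_input_embeddings().weight
```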

Dependencies (required for code-related issues)

# Paste your dependency information here

Run logs or screenshots

# Paste the run log here

airaria commented 1 year ago

In practice it does not tie the embeddings, so you can use it with confidence.
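A minimal sanity check, assuming the Hugging Face transformers LLaMA classes: build a small random-weight model (LlamaConfig defaults to tie_word_embeddings=False, matching the released LLaMA checkpoints), resize the vocabulary, and assert that the input and output embedding matrices remain separate tensors. The hidden size and the 49953 target vocabulary (the size of the merged Chinese-LLaMA tokenizer) are illustrative.

```python
from transformers import LlamaConfig, LlamaForCausalLM

# Tiny random-weight model for a quick check; LlamaConfig defaults to
# tie_word_embeddings=False, as in the released LLaMA checkpoints.
config = LlamaConfig(
    vocab_size=32000,
    hidden_size=64,
    intermediate_size=128,
    num_hidden_layers=2,
    num_attention_heads=4,
)
model = LlamaForCausalLM(config)

# Extend the vocabulary, e.g. after merging a Chinese tokenizer.
model.resize_token_embeddings(49953)

inp = model.get_input_embeddings().weight   # embed_tokens
out = model.get_output_embeddings().weight  # lm_head

# Both matrices are resized, but they remain distinct parameters:
# resize_token_embeddings only ties them when config.tie_word_embeddings
# is True, which it is not for LLaMA.
assert inp.shape == out.shape == (49953, 64)
assert inp.data_ptr() != out.data_ptr(), "embeddings would be tied"
print("input and output embeddings are separate tensors")
```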

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

github-actions[bot] commented 1 year ago

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.