Closed. AceCHQ closed this issue 8 months ago.
Try tokenizer.add_tokens() and pass in a list.
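A minimal sketch of that approach with a Hugging Face tokenizer; the model path and token strings below are placeholders, not values from this issue:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the original tokenizer and model (placeholder path).
model_path = "path/to/base/model"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

# add_tokens() accepts a list; special_tokens=True marks them as
# special tokens so they are never split during tokenization.
num_added = tokenizer.add_tokens(["<TOKEN_A>", "<TOKEN_B>"], special_tokens=True)
print(f"Added {num_added} tokens, new vocab size: {len(tokenizer)}")

# Equivalent route via add_special_tokens(), which takes a dict:
# tokenizer.add_special_tokens({"additional_special_tokens": ["<TOKEN_C>"]})

# If the tokens will be used with a model, resize its embedding
# matrix to match the enlarged vocabulary.
model.resize_token_embeddings(len(tokenizer))
```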
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.
Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.
Items to check before submitting
Issue type
Other
Base model
None
Operating system
Linux
Describe the problem in detail
Hello, I saw that you trained on a Chinese corpus with sentencepiece and merged the resulting vocabulary with the original one; I can run that part. If I only want to add special tokens to the original tokenizer, how should I proceed? Is there a similar issue I can refer to? Thanks.
Dependencies (required for code-related issues)
Runtime logs or screenshots