NLPJCL / RAG-Retrieval

Unify Efficient Fine-tuning of RAG Retrieval, including Embedding, ColBERT, Cross Encoder
MIT License

Error when training bge-m3 and bce-embedding-base_v1 #11

Closed: njxisang closed this 4 months ago

njxisang commented 4 months ago

Training the two models above with train_embedding.sh raises an error:

```
Traceback (most recent call last):
  File "train_embedding.py", line 182, in <module>
    main()
  File "train_embedding.py", line 109, in main
    model = accelerator.prepare(model)
  File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1292, in prepare
    result = tuple(
  File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1293, in <genexpr>
    self._prepare_one(obj, first_pass=True, device_placement=d) for obj, d in zip(args, device_placement)
  File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1169, in _prepare_one
    return self.prepare_model(obj, device_placement=device_placement)
  File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1443, in prepare_model
    self.state.fsdp_plugin.set_auto_wrap_policy(model)
  File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/utils/dataclasses.py", line 1182, in set_auto_wrap_policy
    raise Exception("Could not find the transformer layer class to wrap in the model.")
Exception: Could not find the transformer layer class to wrap in the model.
```

NLPJCL commented 4 months ago

See: https://github.com/NLPJCL/RAG-Retrieval/issues/5
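(Context, not reproduced from the linked issue.) This exception is raised by accelerate's FSDP `TRANSFORMER_BASED_WRAP` auto-wrap policy when it cannot identify the model's transformer block class. The usual fix is to name that class explicitly in the accelerate FSDP config; a sketch, assuming bge-m3 and bce-embedding-base_v1 use XLM-RoBERTa blocks (verify the actual class name by printing the loaded model):

```yaml
# Sketch of the relevant part of an accelerate config file (assumption,
# not copied from issue #5). XLMRobertaLayer is the assumed block class
# for XLM-RoBERTa-based models; check `print(model)` for your model.
distributed_type: FSDP
fsdp_config:
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_transformer_layer_cls_to_wrap: XLMRobertaLayer
```

The same value can alternatively be passed on the command line via `accelerate launch --fsdp_transformer_layer_cls_to_wrap XLMRobertaLayer ...`.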

njxisang commented 4 months ago

Thanks, that solved it!