Training the two models above with train_embedding.sh fails with the following error:
Traceback (most recent call last):
File "train_embedding.py", line 182, in <module>
main()
File "train_embedding.py", line 109, in main
model = accelerator.prepare(model)
File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1292, in prepare
result = tuple(
File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1293, in <genexpr>
self._prepare_one(obj, first_pass=True, device_placement=d) for obj, d in zip(args, device_placement)
File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1169, in _prepare_one
return self.prepare_model(obj, device_placement=device_placement)
File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/accelerator.py", line 1443, in prepare_model
self.state.fsdp_plugin.set_auto_wrap_policy(model)
File "/home/nlp/miniconda3/envs/rag-retrieval/lib/python3.8/site-packages/accelerate/utils/dataclasses.py", line 1182, in set_auto_wrap_policy
raise Exception("Could not find the transformer layer class to wrap in the model.")
Exception: Could not find the transformer layer class to wrap in the model.
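The exception is raised by Accelerate's FSDP plugin when its auto-wrap policy cannot infer which transformer block class of the model to shard. A minimal sketch of one possible workaround, assuming the model is a BERT-style encoder — the class name `BertLayer` here is an assumption and must be replaced with the layer class of the model actually being trained:

```shell
# Sketch of a possible workaround (not verified against this repo):
# tell Accelerate's FSDP plugin which transformer layer class to wrap,
# since auto-detection failed for this model. BertLayer is a placeholder;
# use the block class name of the actual model architecture.
export FSDP_TRANSFORMER_CLS_TO_WRAP=BertLayer
bash train_embedding.sh
```

Equivalently, the `fsdp_transformer_layer_cls_to_wrap` field under `fsdp_config` in the accelerate config YAML can be set to the same class name.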