ggbetz opened this issue 3 months ago
I think our transformers version might be out of date — see https://github.com/huggingface/text-generation-inference/issues/1620 for a similar report. The error we get:
```
2024-05-13T17:16:46.166834523Z 2024-05-13 17:16:46,166 - root - INFO - Loading vLLM model jetmoe/jetmoe-8b
2024-05-13T17:16:48.518010161Z Traceback (most recent call last):
2024-05-13T17:16:48.518059680Z File "/usr/local/bin/cot-eval", line 8, in <module>
2024-05-13T17:16:48.518069710Z sys.exit(main())
2024-05-13T17:16:48.518076080Z File "/workspace/cot-eval/src/cot_eval/__main__.py", line 149, in main
2024-05-13T17:16:48.518202040Z llm = VLLM(
2024-05-13T17:16:48.518229400Z File "/usr/local/lib/python3.10/dist-packages/langchain_core/load/serializable.py", line 120, in __init__
2024-05-13T17:16:48.518319969Z super().__init__(**kwargs)
2024-05-13T17:16:48.518333769Z File "/usr/local/lib/python3.10/dist-packages/pydantic/v1/main.py", line 341, in __init__
2024-05-13T17:16:48.518495708Z raise validation_error
2024-05-13T17:16:48.518561108Z pydantic.v1.error_wrappers.ValidationError: 1 validation error for VLLM
2024-05-13T17:16:48.518565458Z __root__
2024-05-13T17:16:48.518568898Z The checkpoint you are trying to load has model type `jetmoe` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date. (type=value_error)
```
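A quick way to confirm the diagnosis is to check whether the installed `transformers` release has the `jetmoe` model type registered in its auto-config mapping (a sketch — I believe JetMoE support landed around transformers 4.40, but that version number is an assumption worth double-checking against the release notes):

```python
# Sketch: check whether the installed transformers release knows the
# `jetmoe` model type. If it is missing, we hit exactly the
# "does not recognize this architecture" error from the log above.
# Assumption: JetMoE support was added around transformers 4.40.
try:
    import transformers
    from transformers.models.auto.configuration_auto import CONFIG_MAPPING

    print("transformers version:", transformers.__version__)
    print("jetmoe registered:", "jetmoe" in CONFIG_MAPPING)
except ImportError:
    print("transformers is not installed in this environment")
```

If `jetmoe registered` comes back `False`, upgrading `transformers` in the image (e.g. `pip install -U transformers`) should make the checkpoint loadable.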