baidubce / bce-qianfan-sdk

Provide best practices for LMOps, as well as elegant and convenient access to the features of the Qianfan MaaS Platform.
https://cloud.baidu.com/doc/WENXINWORKSHOP/index.html
Apache License 2.0

The provided model 'Meta-Llama-3-70b-Instruct' is not in the list of supported models #478

Closed chxb closed 5 months ago

chxb commented 5 months ago

System Info

qianfan version: 0.3.9

Reproduction

Traceback (most recent call last):
  File "xxx/python3.9/site-packages/langchain_core/language_models/llms.py", line 246, in invoke
    self.generate_prompt(
  File "xxx/python3.9/site-packages/langchain_core/language_models/llms.py", line 541, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
  File "xxx/python3.9/site-packages/langchain_core/language_models/llms.py", line 714, in generate
    output = self._generate_helper(
  File "xxx/python3.9/site-packages/langchain_core/language_models/llms.py", line 578, in _generate_helper
    raise e
  File "xxx/python3.9/site-packages/langchain_core/language_models/llms.py", line 565, in _generate_helper
    self._generate(
  File "xxx/python3.9/site-packages/langchain_core/language_models/llms.py", line 1153, in _generate
    self._call(prompt, stop=stop, run_manager=run_manager, **kwargs)
  File "xxx/python3.9/site-packages/langchain_community/llms/baidu_qianfan_endpoint.py", line 183, in _call
    response_payload = self.client.do(**params)
  File "xxx/python3.9/site-packages/qianfan/resources/llm/completion.py", line 206, in do
    return self._do(
  File "xxx/python3.9/site-packages/qianfan/resources/llm/base.py", line 245, in _do
    model, endpoint = self._update_model_and_endpoint(model, endpoint)
  File "xxx/python3.9/site-packages/qianfan/resources/llm/base.py", line 179, in _update_model_and_endpoint
    model_info = self.get_model_info(model_name)
  File "xxx/python3.9/site-packages/qianfan/resources/llm/base.py", line 460, in get_model_info
    raise errors.InvalidArgumentError(
qianfan.errors.InvalidArgumentError: The provided model Meta-Llama-3-70B-Instruct is not in the list of supported models. If this is a recently added model, try using the endpoint arguments and create an issue to tell us.
Supported models: {'ERNIE-Speed-8K', 'ERNIE-3.5-8K-0205', 'ERNIE-3.5-8K-1222', 'AquilaChat-7B', 'Mixtral-8x7B-Instruct', 'ERNIE-Bot-turbo', 'ChatGLM2-6B-32K', 'ERNIE Speed', 'EB-turbo-AppBuilder', 'BLOOMZ-7B', 'ERNIE-Speed-128K', 'ERNIE-Lite-8K-0922', 'ERNIE 3.5', 'ERNIE-Lite-8K-0308', 'ChatLaw', 'ERNIE-3.5-4K-0205', 'ERNIE-3.5-8K', 'Llama-2-7B-Chat', 'Yi-34B-Chat', 'ERNIE-Bot-turbo-AI', 'Llama-2-13B-Chat', 'XuanYuan-70B-Chat-4bit', 'ERNIE-4.0-8K', 'ERNIE-Speed', 'Qianfan-BLOOMZ-7B-compressed', 'Llama-2-70B-Chat', 'ERNIE-Bot', 'ERNIE-Bot-4', 'Qianfan-Chinese-Llama-2-13B', 'CodeLlama-7b-Instruct', 'Qianfan-Chinese-Llama-2-7B', 'ERNIE-Bot-8K', 'SQLCoder-7B', 'ERNIE Speed-AppBuilder'}
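The traceback shows the failure happening in `_update_model_and_endpoint`: when a call is made by `model` name, the SDK resolves it against a model table shipped with the installed package version, so a model released after qianfan 0.3.9 cannot be found locally. A simplified sketch of that resolution logic (illustrative only, not the SDK's actual code; names and table contents here are hypothetical):

```python
# Hypothetical sketch of a static model-name lookup like the one in the
# traceback above. The table and function names are illustrative.
SUPPORTED_MODELS = {
    "ERNIE-3.5-8K": "/chat/completions",
    "Llama-2-70B-Chat": "/chat/llama_2_70b",
    # ...models released after this SDK version (e.g. Llama 3) are absent
}


class InvalidArgumentError(ValueError):
    """Stand-in for qianfan.errors.InvalidArgumentError."""


def resolve(model=None, endpoint=None):
    """Return the endpoint to call; an explicit endpoint skips the lookup."""
    if endpoint is not None:
        # Workaround path: the static model table is never consulted.
        return endpoint
    if model not in SUPPORTED_MODELS:
        raise InvalidArgumentError(
            f"The provided model {model} is not in the list of supported models."
        )
    return SUPPORTED_MODELS[model]
```

Passing an explicit `endpoint` therefore sidesteps the stale table entirely, which is why the workaround suggested in this thread works on older SDK versions.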

ZingLix commented 5 months ago

Try using endpoint = "llama_3_70b" as a workaround.

Alternatively, you can authenticate using an access key, and the model list will be refreshed automatically from the platform. Then, model="Meta-Llama-3-70B" should work seamlessly.
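Both options, sketched for the LangChain path shown in the traceback (the credential values are placeholders, and you should verify the exact parameter names against your installed `langchain_community` and `qianfan` versions):

```python
import os

# Option 2: authenticate with an IAM access key / secret key so the SDK can
# refresh its model list from the platform (placeholder values; use your own).
os.environ["QIANFAN_ACCESS_KEY"] = "your-access-key"
os.environ["QIANFAN_SECRET_KEY"] = "your-secret-key"

from langchain_community.llms import QianfanLLMEndpoint

# Option 1: bypass the local model table by naming the endpoint directly.
llm = QianfanLLMEndpoint(endpoint="llama_3_70b")
```

This is a configuration sketch, not a verified run; with access-key auth in place, `QianfanLLMEndpoint(model="Meta-Llama-3-70B")` should also resolve once the refreshed model list is fetched.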