Open Alexisxty opened 1 month ago
In the conf/openai_config.json configuration file you can see that the azure option is set to false. Although we have adapted the code for Azure, we do not use it ourselves; we use the native OpenAI API instead.
If you want to use a proxy (relay) service provider, you can modify lines 95-108 of cradle/provider/llm/openai.py to point the client at your provider's endpoint:
```python
if conf_dict[PROVIDER_SETTING_IS_AZURE]:
    key = os.getenv(key_var_name)
    endpoint_var_name = conf_dict[PROVIDER_SETTING_BASE_VAR]
    endpoint = os.getenv(endpoint_var_name)
    self.client = AzureOpenAI(
        api_key=key,
        api_version=conf_dict[PROVIDER_SETTING_API_VERSION],
        azure_endpoint=endpoint,
    )
else:
    key = os.getenv(key_var_name)
    self.client = OpenAI(api_key=key)
```
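One way to wire in a proxy provider without touching the Azure branch is to pass the official openai-python client's `base_url` parameter, which routes all requests to the given endpoint instead of api.openai.com. A minimal sketch of the idea follows; the helper name `build_openai_client_kwargs`, the environment variable `OPENAI_PROXY_BASE_URL`, and the default key variable name are assumptions for illustration, not part of Cradle's configuration:

```python
import os

def build_openai_client_kwargs(key_var_name="OA_OPENAI_KEY"):
    """Build the keyword arguments for OpenAI(...), optionally
    redirecting traffic to a proxy/relay provider.

    key_var_name: env var holding the API key (hypothetical default).
    OPENAI_PROXY_BASE_URL: assumed env var for the proxy endpoint.
    """
    kwargs = {"api_key": os.getenv(key_var_name)}
    base_url = os.getenv("OPENAI_PROXY_BASE_URL")
    if base_url:
        # The openai-python client accepts base_url; when set, all
        # API calls go to this endpoint instead of api.openai.com.
        kwargs["base_url"] = base_url
    return kwargs
```

In the `else` branch above you would then write `self.client = OpenAI(**build_openai_client_kwargs(key_var_name))`, so the proxy is used only when the environment variable is present and the default behavior is unchanged otherwise.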
Thank you very much, I have made the changes; I will reply here if any problems come up.
Hello, author team! While using Cradle I would like to try the OpenAI API, but I noticed that you interact with OpenAI through a Microsoft virtual machine. We have a partner that provides a relay (proxy) API service. Is there a reference guide to help me modify the request-sending logic?