Closed jjaychen1e closed 8 months ago
I'm currently also stuck with this issue. It's possible that the data structure returned by https://backend.raycast.com/api/v1/ai/models is affecting it, but I'm not exactly sure what the API response should be. Perhaps we might need to subscribe to Pro and then use MITM to capture the data source for further testing. 😕🔍
I used the free trial to capture the response and found that we're missing some values in the `features` field. After adding these values, it works fine.
Here is the response from the official https://backend.raycast.com/api/v1/ai/models:
```json
{
  "models": [
    {
      "id": "openai-gpt-3.5-turbo-instruct",
      "model": "gpt-3.5-turbo-instruct",
      "name": "GPT-3.5 Turbo Instruct",
      "features": [
        "commands",
        "api"
      ],
      "status": null,
      "requires_better_ai": false,
      "provider": "openai",
      "provider_name": "OpenAI"
    },
    {
      "id": "openai-gpt-3.5-turbo",
      "model": "gpt-3.5-turbo",
      "name": "GPT-3.5 Turbo",
      "features": [
        "chat",
        "quick_ai",
        "commands",
        "api"
      ],
      "status": null,
      "requires_better_ai": false,
      "provider": "openai",
      "provider_name": "OpenAI"
    },
    {
      "id": "openai-gpt-4",
      "model": "gpt-4",
      "name": "GPT-4",
      "features": [
        "chat",
        "quick_ai",
        "commands",
        "api"
      ],
      "status": "beta",
      "requires_better_ai": true,
      "provider": "openai",
      "provider_name": "OpenAI"
    }
  ],
  "default_models": {
    "chat": "openai-gpt-3.5-turbo",
    "quick_ai": "openai-gpt-3.5-turbo",
    "commands": "openai-gpt-3.5-turbo-instruct",
    "api": "openai-gpt-3.5-turbo-instruct"
  }
}
```
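For illustration, a minimal sketch of the kind of fix this implies for a proxy: walk the `models` array and make sure each entry advertises the feature flags seen in the official response. This is not the actual PR code; the `REQUIRED_FEATURES` list is an assumption based on the payload above (the real API grants different feature sets per model).

```python
# Illustrative sketch, not the actual proxy implementation.
# REQUIRED_FEATURES is assumed from the official response captured above.
REQUIRED_FEATURES = ["chat", "quick_ai", "commands", "api"]

def patch_models(payload: dict) -> dict:
    """Ensure every model entry contains the expected feature flags."""
    for model in payload.get("models", []):
        # Create the "features" list if it is missing entirely.
        features = model.setdefault("features", [])
        for feature in REQUIRED_FEATURES:
            if feature not in features:
                features.append(feature)
    return payload

# Example: a model entry missing everything but "chat".
payload = {"models": [{"id": "openai-gpt-4", "features": ["chat"]}]}
patched = patch_models(payload)
print(patched["models"][0]["features"])
```

The idea is simply that Raycast appears to hide models (or whole UI entry points) whose `features` list omits the capability being used, so a proxy returning this endpoint needs to populate those values.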
I've created a PR: https://github.com/yufeikang/raycast_api_proxy/pull/17
Nothing happened after clicking "Create AI Command". I'm not sure whether it's related to the model list; I noticed there is a model field in the official Raycast AI tutorial on the same page.