-
How can I change the LLM that the 01 is connecting with? As I understand it, the default is GPT-4, which is very expensive to use via the API. How can I change it to GPT-3.5 Turbo, which is ten times cheaper…
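As a general illustration (not the 01's actual internals, which I haven't verified): tools built on an OpenAI-compatible chat API select the model with a single string in the request body, so swapping GPT-4 for GPT-3.5 Turbo is typically a one-line configuration change. A minimal sketch:

```python
# Minimal sketch, assuming the tool talks to an OpenAI-compatible chat API:
# switching from GPT-4 to GPT-3.5 Turbo is just a different model string.
# Check the 01 docs/CLI help for where that string is actually configured.

def chat_payload(model: str, user_message: str) -> dict:
    """Build the request body for POST /v1/chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

expensive = chat_payload("gpt-4", "hello")
cheaper = chat_payload("gpt-3.5-turbo", "hello")  # ~10x cheaper per token
```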
-
-
Any plans to use the GPT-3.5-turbo/GPT-4-turbo API? This could reduce cost and give better results.
-
There is no model with this name, and my API key (fresh from OpenAI) is rejected as 'invalid'.
I am also told that openai 0.28 is preferable to openai 1.0.0!
I'm stuck!
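One likely culprit: the openai Python SDK changed its calling style at version 1.0, and code written for 0.28 fails under 1.x with confusing errors (the old call path even tells you to migrate or pin `openai==0.28`). A sketch of the difference, with the two call styles shown as comments; the version strings are illustrative:

```python
# Sketch: determine which openai SDK style applies. Assumes the 'openai'
# package is installed; e.g. openai_major(importlib.metadata.version("openai")).

def openai_major(ver: str) -> int:
    """Major version from a string like '1.30.1' or '0.28.0'."""
    return int(ver.split(".")[0])

# openai 0.28.x style:
#   import openai
#   openai.api_key = "sk-..."
#   openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[...])
#
# openai >= 1.0 style:
#   from openai import OpenAI
#   client = OpenAI(api_key="sk-...")
#   client.chat.completions.create(model="gpt-3.5-turbo", messages=[...])
```

If the surrounding project expects 0.28 (as the message suggests), installing a 1.x SDK alongside it will break before the key is ever validated, which can surface as an 'invalid key' error.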
-
### What happened?
I was running `fabric --listmodels`, and the local models list is empty even though ollama is running:
GPT Models:
gpt-3.5-turbo
gpt-3.5-turbo-0125
gpt-3.5-turbo-0301
gpt-3.5-turbo-0613
gp…
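When fabric shows no local models, a quick sanity check is whether the ollama server itself is reachable and actually has models pulled. A standard-library sketch; `http://localhost:11434` and the `/api/tags` route are ollama's defaults, but verify against your setup:

```python
import json
import urllib.request  # used by the commented-out live query below

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # ollama's default model-listing endpoint

def model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON returned by /api/tags."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

# Uncomment to query a running ollama server:
# with urllib.request.urlopen(OLLAMA_TAGS_URL) as resp:
#     print(model_names(resp.read().decode()))
```

If this returns an empty list (or the connection is refused), the problem is on the ollama side rather than in fabric.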
-
### Question Validation
- [X] I have searched both the documentation and discord for an answer.
### Question
I am trying to use gpt-4o as my model for RAG on PDF/txt documents. Could someone plea…
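Independent of whichever framework the question concerns, the core RAG step is the same: retrieved document chunks are stuffed into the prompt sent to the chat model. A framework-agnostic sketch; the helper name here is hypothetical, not any library's API:

```python
# Hypothetical sketch of the RAG prompt step: ground the model's answer in
# retrieved PDF/txt chunks by placing them in the system message.

def build_rag_messages(chunks: list[str], question: str) -> list[dict]:
    """Compose chat messages that ground the answer in retrieved text."""
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    return [
        {"role": "system",
         "content": "Answer using only the context below.\n\n" + context},
        {"role": "user", "content": question},
    ]

messages = build_rag_messages(["PDF excerpt A", "TXT excerpt B"], "What does A say?")
# payload = {"model": "gpt-4o", "messages": messages}
```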
-
GPT-4o runs fine for me.
But when I switched to a local model, I got an error message:
EXCEPTION: 'function' object has no attribute 'name'
![image](https://github.com/onuratakan/gpt-compute…
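Without seeing the failing code, this error usually means a bare Python function was passed where the framework expected an object exposing a `.name` attribute (functions have `__name__`, not `name`). A hypothetical minimal repro and fix:

```python
# Hypothetical repro of "'function' object has no attribute 'name'": the
# caller expects an object with .name, but received a bare function.

def greet() -> str:
    return "hi"

# greet.name  -> AttributeError (functions expose __name__, not name)

class NamedTool:
    """Wrap a function so it exposes the .name attribute callers expect."""
    def __init__(self, fn):
        self.fn = fn
        self.name = fn.__name__
    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)

tool = NamedTool(greet)
```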
-
Latest version; here is the configuration file:
- qwen: # custom model name
    alias: gpt-4,gpt-3.5-turbo,gpt-3.5-turbo-16k # aliases, e.g. gpt4, gpt3
    enable: true # true or false
    model_name_or_path: /home/k/Qwen1.5-7B-Chat-GPTQ-I…
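The `alias` field above maps OpenAI-style model names onto the locally served Qwen model, so resolution amounts to a lookup table. The table contents below mirror the config; the resolver itself is a hypothetical illustration, not the project's actual code:

```python
# Hypothetical alias resolution: OpenAI-style model names requested by a
# client are routed to the locally served model name from the config.
ALIASES = {
    "gpt-4": "qwen",
    "gpt-3.5-turbo": "qwen",
    "gpt-3.5-turbo-16k": "qwen",
}

def resolve_model(requested: str) -> str:
    """Return the local model name for an aliased request, else pass through."""
    return ALIASES.get(requested, requested)
```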
-
![image](https://github.com/janhq/cortex-web/assets/64197333/bf1bc3b7-aeac-4cba-a84b-8896fb5ab9a9)
`cortex run gpt-3.5-turbo`
-
Looks like
`public static let defaultModel = "gpt-3.5-turbo"`
was removed from ChatGPTAPI on line 18; should this be added back?
It was present in build 1.7.0 but is not present in 2.1.0.