Closed: refeed closed this 5 months ago
Update your types/openai.ts file like this:
import { OPENAI_API_TYPE } from '../utils/app/const';

export interface OpenAIModel {
  id: string;
  name: string;
  maxLength: number; // maximum length of a message
  tokenLimit: number;
}

export enum OpenAIModelID {
  GPT_3_5 = 'gpt-3.5-turbo',
  GPT_3_5_AZ = 'gpt-35-turbo',
  GPT_4 = 'gpt-4',
  GPT_4_32K = 'gpt-4-32k',
  GPT_4_32K_0613 = 'gpt-4-32k-0613',
  GPT_4_128K = 'gpt-4-1106-preview',
}

// in case the `DEFAULT_MODEL` environment variable is not set or set to an unsupported model
export const fallbackModelID = OpenAIModelID.GPT_3_5;

export const OpenAIModels: Record<OpenAIModelID, OpenAIModel> = {
  [OpenAIModelID.GPT_3_5]: {
    id: OpenAIModelID.GPT_3_5,
    name: 'GPT-3.5',
    maxLength: 12000,
    tokenLimit: 4000,
  },
  [OpenAIModelID.GPT_3_5_AZ]: {
    id: OpenAIModelID.GPT_3_5_AZ,
    name: 'GPT-3.5',
    maxLength: 12000,
    tokenLimit: 4000,
  },
  [OpenAIModelID.GPT_4]: {
    id: OpenAIModelID.GPT_4,
    name: 'GPT-4',
    maxLength: 24000,
    tokenLimit: 8000,
  },
  [OpenAIModelID.GPT_4_32K]: {
    id: OpenAIModelID.GPT_4_32K,
    name: 'GPT-4-32K',
    maxLength: 96000,
    tokenLimit: 32000,
  },
  [OpenAIModelID.GPT_4_32K_0613]: {
    id: OpenAIModelID.GPT_4_32K_0613,
    name: 'GPT-4-32K-0613',
    maxLength: 96000,
    tokenLimit: 32000,
  },
  [OpenAIModelID.GPT_4_128K]: {
    id: OpenAIModelID.GPT_4_128K,
    name: 'GPT-4-128K',
    maxLength: 96000,
    tokenLimit: 128000,
  },
};
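To illustrate what the fallbackModelID is for: the app can resolve the DEFAULT_MODEL environment variable against the known model ids and fall back when it is unset or unsupported. This is a minimal sketch, not chatbot-ui's actual code; resolveDefaultModel is a hypothetical helper, and the enum is an abridged copy of the one above.

```typescript
// Abridged copy of the OpenAIModelID enum from types/openai.ts.
enum OpenAIModelID {
  GPT_3_5 = 'gpt-3.5-turbo',
  GPT_4 = 'gpt-4',
  GPT_4_128K = 'gpt-4-1106-preview',
}

const fallbackModelID = OpenAIModelID.GPT_3_5;

// Hypothetical helper: return the env value if it names a known model,
// otherwise fall back to fallbackModelID.
function resolveDefaultModel(envValue: string | undefined): OpenAIModelID {
  const known = Object.values(OpenAIModelID) as string[];
  return envValue && known.includes(envValue)
    ? (envValue as OpenAIModelID)
    : fallbackModelID;
}
```

For example, resolveDefaultModel(process.env.DEFAULT_MODEL) yields 'gpt-3.5-turbo' when the variable is missing or misspelled.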
I am running chatbot-ui in Docker and couldn't find openai.ts there. Is there a way to add additional models?
Thanks for the tip @TheRakeshPurohit
One question, though: I now see only three models listed. Shouldn't all five of the following appear (why only three of the five)?
GPT_3_5 = 'gpt-3.5-turbo',
GPT_4 = 'gpt-4',
GPT_4_32K = 'gpt-4-32k',
GPT_4_32K_0613 = 'gpt-4-32k-0613',
GPT_4_128K = 'gpt-4-1106-preview'
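One possible explanation (an assumption about how chatbot-ui behaves, not confirmed in this thread): the model picker typically shows only the intersection of the ids your API key can actually access (from OpenAI's model-list endpoint) and the entries in the OpenAIModels map, so models your key lacks access to disappear. A sketch of that filtering, with a hypothetical visibleModels helper and an abridged map:

```typescript
interface OpenAIModel { id: string; name: string }

// Abridged copy of the OpenAIModels map from types/openai.ts.
const OpenAIModels: Record<string, OpenAIModel> = {
  'gpt-3.5-turbo': { id: 'gpt-3.5-turbo', name: 'GPT-3.5' },
  'gpt-4': { id: 'gpt-4', name: 'GPT-4' },
  'gpt-4-32k': { id: 'gpt-4-32k', name: 'GPT-4-32K' },
};

// idsFromApi would come from the OpenAI model-list endpoint; keep only
// the ids the app knows about, dropping everything else.
function visibleModels(idsFromApi: string[]): OpenAIModel[] {
  return idsFromApi
    .filter((id) => id in OpenAIModels)
    .map((id) => OpenAIModels[id]);
}
```

Under this assumption, a key without gpt-4-32k access would never show the 32K entries no matter what openai.ts contains.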
Please share a screenshot of your openai.ts file. You must paste the entire file code I shared.
Hi @TheRakeshPurohit
I compared the code you shared with my openai.ts file in VS Code; there are no differences.
Any further suggestions?
Does this app already make API requests using the new GPT-4 Turbo? From the frontend, it looks like it's still only using GPT-3.5 and GPT-4.
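For context on that last question: the model id selected in the UI is generally what ends up in the body of the chat completions request, so adding 'gpt-4-1106-preview' to the map should make the app request GPT-4 Turbo once it is selectable. A hedged sketch of assembling such a request body (buildChatBody is a hypothetical helper, not chatbot-ui code):

```typescript
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Hypothetical helper: build the JSON body for a POST to the
// chat completions endpoint; the model field is the selected id.
function buildChatBody(modelId: string, messages: ChatMessage[]) {
  return { model: modelId, messages, stream: true };
}

const body = buildChatBody('gpt-4-1106-preview', [
  { role: 'user', content: 'Hello' },
]);
```

Checking the request body in the browser's network tab is a quick way to confirm which model id is actually being sent.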