Describe the bug
The backend does not start in Docker, and it cannot be built manually when only Ollama is used.
To Reproduce
npm run build
> perplexica-backend@1.7.1 build
> tsc
src/agents/suggestionGeneratorAgent.ts:50:4 - error TS2352: Conversion of type 'BaseChatModel<BaseLanguageModelCallOptions, BaseMessageChunk>' to type 'ChatOpenAI' may be a mistake because neither type sufficiently overlaps with the other. If this was intentional, convert the expression to 'unknown' first.
Type 'BaseChatModel<BaseLanguageModelCallOptions, BaseMessageChunk>' is missing the following properties from type 'ChatOpenAI': temperature, topP, frequencyPenalty, presencePenalty, and 11 more.
50 (llm as ChatOpenAI).temperature = 0;
src/websocket/connectionManager.ts:52:7 - error TS2739: Type 'ChatOpenAI<ChatOpenAICallOptions>' is missing the following properties from type 'BaseChatModel<BaseLanguageModelCallOptions, BaseMessageChunk>': getLsParams, _streamEventsV2, _streamEventsV1
52 llm = new ChatOpenAI({
Found 2 errors in 2 files.
Errors  Files
     1  src/agents/suggestionGeneratorAgent.ts:50
     1  src/websocket/connectionManager.ts:52
Additional context
The project cannot be started at all when only Ollama is configured.
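For reference, the double cast that the TS2352 message itself suggests ("convert the expression to 'unknown' first") would look like the sketch below. The interfaces here are stand-ins, not the real LangChain classes, and the cast only silences the compiler; it does not address what looks like an underlying dependency mismatch hinted at by TS2739.

```typescript
// Stand-in types for illustration only; not the real @langchain classes.
interface BaseChatModelLike { invoke(input: string): string; }
interface ChatOpenAILike extends BaseChatModelLike { temperature: number; }

const llm: BaseChatModelLike = {
  invoke: (input: string) => input,
  temperature: 1,
} as ChatOpenAILike;

// TS2352 rejects a direct cast when the two types don't sufficiently
// overlap; routing through `unknown` compiles, but is a blunt escape hatch:
(llm as unknown as ChatOpenAILike).temperature = 0;
```

Note that this is a workaround sketch, not a fix: if the property does not exist on the concrete model (e.g. an Ollama-backed model), assigning `temperature` this way may have no effect at runtime.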