developersdigest / llm-answer-engine

Build a Perplexity-Inspired Answer Engine Using Next.js, Groq, Llama-3, Langchain, OpenAI, Upstash, Brave & Serper
https://developersdigest.tech
MIT License
4.29k stars · 681 forks

ollama interface problem #11

Status: Open · iplayfast opened this issue 3 months ago

iplayfast commented 3 months ago
 ✓ Ready in 667ms
 ○ Compiling / ...
 ✓ Compiled / in 1628ms (1571 modules)
 ✓ Compiled in 293ms (470 modules)
There was a problem with your fetch operation: [Error: Network response was not ok. Status: 422]
Error: Network response was not ok. Status: 422
    at getImages (webpack-internal:///(rsc)/./app/action.tsx:178:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Promise.all (index 0)
    at async eval (webpack-internal:///(rsc)/./app/action.tsx:293:43)
Error: Network response was not ok. Status: 422
    at getImages (webpack-internal:///(rsc)/./app/action.tsx:178:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Promise.all (index 0)
    at async eval (webpack-internal:///(rsc)/./app/action.tsx:293:43)
 ⨯ unhandledRejection: Error: Network response was not ok. Status: 422
    at getImages (webpack-internal:///(rsc)/./app/action.tsx:178:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Promise.all (index 0)
    at async eval (webpack-internal:///(rsc)/./app/action.tsx:293:43)
 ⨯ unhandledRejection: Error: Network response was not ok. Status: 422
    at getImages (webpack-internal:///(rsc)/./app/action.tsx:178:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Promise.all (index 0)
    at async eval (webpack-internal:///(rsc)/./app/action.tsx:293:43)
Error fetching search results: [Error: HTTP error! status: 422]
Error fetching videos: [Error: Network response was not ok. Status: 403]
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call `.done()`.
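The 422/403 rejections above propagate through `Promise.all`, which rejects as soon as any one source fails; that in turn leaves the streamable UI waiting, hence the `.done()` warning. A minimal defensive sketch of how such fetches could be guarded (assuming the helpers like `getImages` wrap `fetch`; `safeFetchJson` is a hypothetical helper, not part of the repo):

```typescript
// Hypothetical helper: return a fallback instead of throwing, so one failing
// source (e.g. a 422 from the image search) does not reject the surrounding
// Promise.all and stall the streamed response.
async function safeFetchJson<T>(url: string, init: RequestInit, fallback: T): Promise<T> {
  try {
    const res = await fetch(url, init);
    if (!res.ok) {
      // Log and degrade gracefully rather than rejecting the whole request.
      console.error(`Fetch to ${url} failed with status ${res.status}`);
      return fallback;
    }
    return (await res.json()) as T;
  } catch (err) {
    console.error(`Fetch to ${url} threw:`, err);
    return fallback;
  }
}
```

With this pattern, a failing image or video lookup would yield an empty result set while the rest of the answer still streams.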

app/config.tsx

// - The values below are the defaults; eventually this will move to a UI component so it can be easily changed by the user
// - To enable and use Ollama models, ensure the inference and/or embeddings model is downloaded and Ollama is running: https://ollama.com/library
// - Icons within the UI are not yet dynamic; to change them, you currently must edit the img src path in the UI component
// - IMPORTANT: enabling Ollama embeddings and Ollama inference at the same time can make time-to-first-token quite long
// - IMPORTANT: Follow-up questions are not yet implemented with Ollama models, only with OpenAI-compatible models that support {type: "json_object"}

export const config = {
    useOllamaInference: true, 
    useOllamaEmbeddings: true, 
    inferenceModel: 'mistral', // Groq: 'mixtral-8x7b-32768', 'gemma-7b-it' // OpenAI: 'gpt-3.5-turbo', 'gpt-4' // Ollama: 'mistral', 'llama2', etc.
    inferenceAPIKey: process.env.GROQ_API_KEY, // Groq: process.env.GROQ_API_KEY // OpenAI: process.env.OPENAI_API_KEY // Ollama: 'ollama' is the default
    embeddingsModel: 'llama2', // Ollama: 'llama2', 'nomic-embed-text' // OpenAI 'text-embedding-3-small', 'text-embedding-3-large'
    textChunkSize: 500, // Recommended to decrease for Ollama
    textChunkOverlap: 200, // Recommended to decrease for Ollama
    numberOfSimilarityResults: 4, // Number of similarity results to return per page
    numberOfPagesToScan: 5, // Recommended to decrease for Ollama
    nonOllamaBaseURL: 'https://api.groq.com/openai/v1', //Groq: https://api.groq.com/openai/v1 // OpenAI: https://api.openai.com/v1 
};
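For reference, a fully Ollama-only variant of this config might look like the following. This is a sketch, not the repo's shipped defaults: the model names and reduced chunk sizes are assumptions based on the comments above ("decrease for Ollama"), and `'ollama'` is the placeholder key the `inferenceAPIKey` comment mentions.

```typescript
// Hypothetical Ollama-only config variant.
// Assumes `ollama pull mistral` and `ollama pull nomic-embed-text` have been run
// and Ollama is serving locally.
export const config = {
    useOllamaInference: true,
    useOllamaEmbeddings: true,
    inferenceModel: 'mistral',
    inferenceAPIKey: 'ollama',           // 'ollama' is the default placeholder, per the comment above
    embeddingsModel: 'nomic-embed-text',
    textChunkSize: 400,                  // reduced, per the "decrease for Ollama" notes
    textChunkOverlap: 100,
    numberOfSimilarityResults: 4,
    numberOfPagesToScan: 3,              // reduced, per the "decrease for Ollama" notes
    nonOllamaBaseURL: 'https://api.groq.com/openai/v1', // unused when both Ollama flags are true
};
```

Note that mixing the two modes (as in the pasted config, which enables Ollama but supplies `process.env.GROQ_API_KEY`) is a common source of confusion.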
noodleA1 commented 3 months ago

I'm getting something a little similar:

(base) arw@AIMonster:/mnt/e/AI_Directories/llm-answer-engine$ npm run dev

> ai-rsc-demo@0.0.0 dev
> next dev

▲ Next.js 14.1.2

app/config is the default, and I have all API keys (Brave, OpenAI, Groq, Serper)

noodleA1 commented 3 months ago

Running a different search turned up a few different errors. Yes... yes, I searched for kiwis.

▲ Next.js 14.1.2

noodleA1 commented 3 months ago

Another attempt after uninstalling node/npm ... (base) arw@AIMonster:/mnt/e/AI_Directories/llm-answer-engine$ npm run dev

> ai-rsc-demo@0.0.0 dev
> next dev

▲ Next.js 14.1.2