Closed · karstegg closed 10 months ago
Can you provide some examples?
I guess this is to be expected, although not ideal. The prompts attempt to steer the LLM away from hallucinations, but I don't know that we'll ever eliminate them entirely.
Once we get full-on LangChain support (using Docker), things might improve.
When I query something using chat, it often responds with generic, off-topic info.