Closed by gordonmurray 1 year ago
Same issue for me.
Same here, everything installs and runs fine but it doesn't work. Also, I tried adding websites as sources, but only one or two pages are scanned. On top of that, I am unable to delete sources once I've added them.
Same issue for me: 0 pages or files scanned even though the status is 'Completed'.
I had to run `npm install langchain @types/node` and then `make install`. I was able to get the LLM to run, but no luck with the context.
Yeah guys, my bad, yesterday I shipped a change that broke PDF and website ingestion.
Just merged a fix, so if you can, `git pull` and run `make install` again. I just verified it locally and it looks good. I think having some tests could prevent this, so I will spend some time putting a test foundation in place.
One remark: codebase ingestion is still brand new, especially the recursive crawl for git repos, so I would suggest giving it at least 2 minutes to fully ingest. I am working on real-time reporting, to tell you exactly what the status of a repo's ingestion is and whether it failed.
Unfortunately it still doesn't work for me, at least with websites.
@giuliastro Can you share the output of `docker compose logs -f` from OpenChat's directory?
Just merged a fix, so if you can, `git pull` and run `make install` again.
A website bot seems to work now. I added a single-page site to the system and it was able to answer some questions well.
I can see indexes being created in Pinecone now too.
PDF upload seems to fail on any PDF over 10 MB or so; when I upload a smaller one, it seems to work. The bot still asks for more context a lot, but that could be down to the questions I'm asking.
Overall, seems to be working now, thank you @gharbat
I am using Ubuntu. I have cloned the repo, populated common.env with the details, and ran `make install`.
The UI is running and I have created a number of bots using a website, a PDF, and a public GitHub repo.
For each bot, once it has finished indexing the data, the bot answers that it is sorry but it needs more context. It can't answer anything at all.
I don't see any errors, and no vectors are created in Pinecone.
The Docker logs from the openchat-llm-server pod show one warning but no errors.
Any suggestions on what I can do to get some answers from a bot?