Is this supposed to have a web interface?

The documentation seems to suggest it does (it refers to pressing a button), and the demo of this I recently saw also featured a web interface. However, having gone through the steps detailed for installation, I can only communicate with the project through the command line. There seems to be no option to upload any files, or to make it ingest any information other than what it already contains.

Am I missing something here?
That's an unfortunate glitch in the install. I think you're at the Ollama prompt. When you're at the `>>>` prompt, just type `/bye`.

Does this help?
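For reference, a session looks roughly like this (the model name is only an example; use whichever model you pulled):

```
$ ollama run llama3
>>> /bye
$
```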
Typing `/bye` does quit the Ollama interface, but it just dumps me back at the Linux shell. So, still no web interface.
Try starting the web interface with:

```
streamlit run Concierge.py
```
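Once that's running, Streamlit serves the app at http://localhost:8501 by default, so the interface should be reachable there in your browser. If that port is already in use, you can pick another with Streamlit's standard flag:

```
streamlit run Concierge.py --server.port 8502
```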
Ok, that worked.
However, I think there's still something wrong. With Concierge running, I can now ingest files into the model. But when I do so and then ask it a question, I only get this error, no matter how much data was ingested: "No sources were found matching your query. Please refine your request to closer match the data in the database or ingest more data."
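For anyone debugging the same message: in a typical RAG setup, this error usually means the similarity search returned nothing above its relevance cutoff, not that ingestion itself failed. A minimal sketch of that general pattern, with hypothetical names and threshold (this is not Concierge's actual code):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec: np.ndarray, doc_vecs: list[np.ndarray],
             threshold: float = 0.75) -> list[int]:
    # Score every stored chunk against the query and keep only those above
    # the cutoff. If nothing clears the threshold, the caller has no context
    # to hand the LLM and reports "no sources found" -- even when the
    # database is full of ingested chunks.
    scored = sorted(
        ((cosine_similarity(query_vec, d), i) for i, d in enumerate(doc_vecs)),
        reverse=True,
    )
    return [i for score, i in scored if score >= threshold]
```

If something like that is happening here, a quick test is to ask a question using terms that appear verbatim in the ingested text (e.g. the name of a specific NIST CSF function) and see whether anything comes back.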
I created a test dataset using the NIST CSF framework, but I just can't get Concierge to output anything other than that message. I'm following the steps on the Concierge main page to the letter.
To clarify, these are the steps I'm following:
*(three screenshots of the steps from the Concierge main page, omitted)*
Anyone have any ideas as to why this isn't working?
0.3.0 uses a different set of dependencies; please remove your currently installed components and try following the new instructions. If you're still running into difficulties, please open a new issue.
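For anyone finding this later: assuming a standard pip/venv setup (and that the 0.3.0 instructions install from a requirements file — check the README for the exact commands), the cleanest way to do that removal is to start from a fresh virtual environment:

```
# create and activate a clean environment, then reinstall per the new docs
python -m venv .venv
source .venv/bin/activate        # on Windows: .venv\Scripts\activate
pip install -r requirements.txt  # hypothetical; use the actual 0.3.0 install steps
```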