InfoSecInnovations / concierge

Repo for Concierge AI dev work
Apache License 2.0

Won't ingest data #33

Open azmatt opened 1 week ago

azmatt commented 1 week ago

I have Concierge installed in an Ubuntu 22.04 VM, and while it runs, it won't load any data from the web or locally. It just stays at 0%.

(screenshot attached showing the ingest progress stuck at 0%)

cshipleyncom commented 6 days ago

I recommend updating to the development branch and trying again.

azmatt commented 5 days ago

Made a completely fresh Ubuntu 22.04 VM and installed the dev branch with no issues. It runs with no errors, but nothing is listening on port 8000. Looking at lsof -Pi, I see several Docker containers listening on some higher ports, and occasionally one pops up listening on port 8000 but disappears instantly, before I can connect.
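For reference, a quick way to confirm whether anything is actually accepting connections on port 8000 (independent of lsof) is a short Python probe like the sketch below; the host, port, and timeout values are assumptions based on the default Uvicorn address shown later in the thread.

    import socket

    # Probe whether anything is accepting TCP connections on the given address.
    # 127.0.0.1:8000 is assumed from the Uvicorn startup line later in this thread.
    def port_open(host="127.0.0.1", port=8000, timeout=1.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        print("port 8000 accepting connections:", port_open())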

cshipleyncom commented 5 days ago

When running the install and launch, did you use install_dev.py and launch_dev.py?

cshipleyncom commented 5 days ago

You shouldn't have to make a new VM - just run the proper Python scripts.

azmatt commented 5 days ago

I'll try that now.

azmatt commented 5 days ago

matt@matt-virtual-machine:~/Desktop/concierge$ sudo python3 launch_dev.py
[sudo] password for matt:
Checking Docker container status...
Ollama running
OpenSearch not found
Docker container dependencies don't appear to be running properly.
Start docker containers with CPU or GPU? [CPU] or GPU:
[+] Running 3/0
 ✔ Container ollama                 Running     0.0s
 ✔ Container opensearch-dashboards  R...        0.0s
 ✔ Container opensearch-node1       Runnin...   0.0s
INFO:     Will watch for changes in these directories: ['/home/matt/Desktop/concierge/concierge_shiny']
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [7218] using WatchFiles
Process SpawnProcess-1:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/matt/Desktop/concierge/lib/python3.10/site-packages/uvicorn/_subprocess.py", line 80, in subprocess_started
    target(sockets=sockets)
  File "/home/matt/Desktop/concierge/lib/python3.10/site-packages/uvicorn/server.py", line 65, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/home/matt/Desktop/concierge/lib/python3.10/site-packages/uvicorn/server.py", line 69, in serve
    await self._serve(sockets)
  File "/home/matt/Desktop/concierge/lib/python3.10/site-packages/uvicorn/server.py", line 76, in _serve
    config.load()
  File "/home/matt/Desktop/concierge/lib/python3.10/site-packages/uvicorn/config.py", line 434, in load
    self.loaded_app = import_from_string(self.app)
  File "/home/matt/Desktop/concierge/lib/python3.10/site-packages/uvicorn/importer.py", line 22, in import_from_string
    raise exc from None
  File "/home/matt/Desktop/concierge/lib/python3.10/site-packages/uvicorn/importer.py", line 19, in import_from_string
    module = importlib.import_module(module_str)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/matt/Desktop/concierge/concierge_shiny/app.py", line 4, in <module>
    from collection_management import collection_management_ui, collection_management_server
  File "/home/matt/Desktop/concierge/concierge_shiny/collection_management.py", line 15, in <module>
    from ingester import ingester_ui, ingester_server
  File "/home/matt/Desktop/concierge/concierge_shiny/ingester.py", line 6, in <module>
    from concierge_backend_lib.loading import load_file
  File "/home/matt/Desktop/concierge/concierge_backend_lib/loading.py", line 9, in <module>
    from loaders.text import TextFileLoader
  File "/home/matt/Desktop/concierge/loaders/text.py", line 3, in <module>
    from binaryornot.check import is_binary
ModuleNotFoundError: No module named 'binaryornot'
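For context on the error above: the traceback shows the failing import comes from loaders/text.py, which pulls in the third-party binaryornot package to decide whether a file is binary before ingesting it. A minimal sketch of what that missing dependency does, assuming the rest of the dev branch is otherwise intact:

    # Stopgap sketch: binaryornot is the missing package named in the traceback,
    # so installing it into the same environment (pip install binaryornot)
    # should clear this particular error.
    from binaryornot.check import is_binary  # same import used by loaders/text.py

    # is_binary() returns True for binary files and False for text files, which is
    # how a loader can skip non-text content before ingestion.
    print(is_binary("/etc/hostname"))  # expected: False (plain text)
    print(is_binary("/bin/ls"))        # expected: True (ELF binary)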

sebovzeoueb commented 5 days ago

Oops, looks like I forgot to include a dependency there! I will fix the dev branch.

We have just released v0.3.0 which shouldn't have those issues, maybe you can try removing your current install and following the instructions on the main branch. Please let me know if you're still having trouble.

azmatt commented 4 days ago

No problem. FYI, I just tried it; first it made me install cargo, which I did, and then it gave me this:

Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Downloading ruff-0.0.15.tar.gz (51 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 51.7/51.7 KB 957.3 kB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
ERROR: Package 'script-builder' requires a different Python: 3.10.12 not in '>=3.12'
Traceback (most recent call last):
  File "/home/matt/concierge/dev_installer.py", line 1, in <module>
    from launch_concierge.concierge_installer.arguments import install_arguments
ModuleNotFoundError: No module named 'launch_concierge'
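For context: the ERROR line is the decisive part of this output. The installer packages declare that they need Python 3.12 or newer, the VM's system Python is 3.10.12, so pip refuses to install them and the later import of launch_concierge fails. A small illustrative check of that constraint (the REQUIRED tuple below is just a stand-in for the declared minimum):

    import sys

    # The installer packages were published requiring Python >= 3.12,
    # which is why pip rejects them on a system Python of 3.10.12.
    REQUIRED = (3, 12)

    if sys.version_info < REQUIRED:
        print(f"Python {sys.version.split()[0]} is too old; "
              f"{REQUIRED[0]}.{REQUIRED[1]}+ is required by these packages.")
    else:
        print("Python version OK:", sys.version.split()[0])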

sebovzeoueb commented 4 days ago

A couple of things:

I think cargo is a dependency of ruff, which is only used in the dev version.

The "requires a different Python" error is because the installer packages currently declare Python >=3.12, and your VM is on 3.10.12.

azmatt commented 4 days ago

I initially missed the Python requirement; looks like I may be waiting until Ubuntu 24.04 actually works in a VM :) Thank you very much for the help!

sebovzeoueb commented 4 days ago

I've downgraded the requirements of the installer packages because I think they should probably work with Python 3.10 even though we're on a later version for development.

Could you try:

pip install launch-concierge --upgrade

python -m launch_concierge.install

and let me know if this works?

Thanks.
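For anyone following along, one illustrative way (not part of the project's tooling) to confirm that the upgraded launch-concierge package is visible to the interpreter before running the install module:

    import importlib.util
    import sys

    # Confirm the interpreter that will run "python -m launch_concierge.install"
    # can actually see the upgraded package; a None result usually means pip and
    # python point at different environments.
    spec = importlib.util.find_spec("launch_concierge")
    if spec is None:
        print(f"launch_concierge not found for {sys.executable}")
    else:
        print("launch_concierge found at:", spec.origin)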