sdsc-ordes / kg-llm-interface

Langchain-powered natural language interface to knowledge-graphs.
Apache License 2.0

Docker compose #3

Closed cmdoret closed 1 year ago

cmdoret commented 1 year ago

This PR aims to containerize the KG-LLM chatbot.

It mainly adds:

When running outside a container, the chatbot server assumes that a ChromaDB (vector database) and GraphDB (SPARQL endpoint) are running on localhost.

The docker-compose setup is currently not in a working state because the ChromaDB service is not reachable from the chat-server. This is probably a misconfiguration of the ports and networks on my part.
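For reference, a minimal sketch of how the networking could be wired (service names, image name and port are assumptions, not the actual file): when both services sit on the same compose network, the chat-server must address Chroma by its service name (e.g. http://chroma:8000) rather than localhost, since localhost inside the container refers to the container itself.

    services:
      chat-server:
        build: .
        environment:
          CHROMA_HOST: chroma    # service name, resolvable on the shared network
          CHROMA_PORT: "8000"    # Chroma's default server port
        networks:
          - backend
      chroma:
        image: chromadb/chroma   # assumed image name
        ports:
          - "8000:8000"          # only needed to reach Chroma from the host
        networks:
          - backend

    networks:
      backend: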

The docker-compose uses profiles so that we can choose which components to deploy. By default, it only deploys the chat-server; to also deploy ChromaDB, one should run:

docker compose --profile db up --build
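For context, a hedged sketch of how the db profile might be attached to the ChromaDB service (service and image names assumed): services without a profiles key are always started, so a plain docker compose up brings up only the chat-server.

    services:
      chat-server:
        build: .                 # no profile: always started
      chroma:
        image: chromadb/chroma   # assumed image name
        profiles: ["db"]         # only started when --profile db is passed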

docker compose automatically reads environment variables from the .env file and injects them into docker-compose.yml. This .env file acts as the single source of truth for hosts and ports. A template version is shipped as .env.example.
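To illustrate (variable names and values are assumptions, not the shipped template), the .env.example could look like this, with docker-compose.yml referring to the values via ${...} substitution:

    # .env.example (hypothetical values)
    CHROMA_HOST=chroma
    CHROMA_PORT=8000
    GRAPHDB_HOST=graphdb
    GRAPHDB_PORT=7200
    CHAT_SERVER_PORT=8080

    # referenced in docker-compose.yml, e.g.
    #   ports:
    #     - "${CHAT_SERVER_PORT}:${CHAT_SERVER_PORT}"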

In a subsequent PR, we'll want to add the front-end and a triple-store (SPARQL endpoint) to the docker-compose to get a fully standalone chatbot. This document will be useful.

cmdoret commented 1 year ago

I am still working on fixing the docker compose port setup, but any contribution is welcome!

cmdoret commented 1 year ago

The env variables were not passed to the chat-server service in docker-compose. The docker-compose setup is now functional, but not yet usable, since the ChromaDB is empty and the SPARQL endpoint is still missing. These will come next.
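For reference, a minimal sketch (variable and service names are assumptions) of what the fix amounts to: ${...} substitution only interpolates values into docker-compose.yml itself, so the container only sees the variables if the service forwards them via env_file: or environment:.

    services:
      chat-server:
        env_file: .env              # forward everything from .env into the container
        # or explicitly, variable by variable:
        environment:
          CHROMA_HOST: ${CHROMA_HOST}
          CHROMA_PORT: ${CHROMA_PORT}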