neomatrix369 / learning-path-index

A repo with data files, assets and code supporting and powering the Learning Path Index Project
MIT License

Communicate with Ollama server outside the docker container #67

Closed neomatrix369 closed 2 months ago

neomatrix369 commented 2 months ago

Adapts the shell script to pass the `OLLAMA_HOST` environment variable into the Docker container so that the Python code inside the container can communicate with an Ollama server running outside it.
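A minimal sketch of the kind of change described above, assuming the wrapper script builds a plain `docker run` command; the image name and the default host URL are assumptions for illustration, not taken from the diff:

```shell
#!/bin/sh
# Sketch only: the image name and default URL below are assumptions.
# Fall back to the Docker host gateway so code inside the container can
# reach an Ollama server on the host; callers may override OLLAMA_HOST.
OLLAMA_HOST="${OLLAMA_HOST:-http://host.docker.internal:11434}"

# -e forwards the variable into the container's environment. The command
# is echoed rather than executed so the sketch can run without Docker.
DOCKER_CMD="docker run --rm -e OLLAMA_HOST=${OLLAMA_HOST} llm-poc-variant-01"
echo "${DOCKER_CMD}"
```

Overriding the variable before invoking the script (`OLLAMA_HOST=http://10.0.0.5:11434 ./run-docker-container.sh`) would then point the containerised code at a different server without editing the script.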

Summary by Sourcery

This pull request updates the Docker run script and Python code to enable communication with the Ollama server outside the Docker container by passing the OLLAMA_HOST environment variable.

sourcery-ai[bot] commented 2 months ago

Reviewer's Guide by Sourcery

This pull request adapts the shell script to pass the OLLAMA_HOST environment variable into the Docker container, allowing the Python code to communicate with the Ollama server outside the Docker container. The changes involve modifying the Docker run command and updating the Python code to use the OLLAMA_HOST environment variable.

File-Level Changes

| File | Changes |
| --- | --- |
| `app/llm-poc-variant-01/docker/run-docker-container.sh` | Modified the Docker run command to pass the `OLLAMA_HOST` environment variable into the container. |
| `app/llm-poc-variant-01/lpiGPT.py` | Updated the Python code to use this variable when communicating with the Ollama server outside the Docker container. |
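On the Python side, picking up the forwarded variable can be sketched as follows; the fallback URL is an assumed default and the `ollama.Client` usage is hypothetical (the actual `lpiGPT.py` change may call the server differently):

```python
import os

# Read the OLLAMA_HOST forwarded by `docker run -e`; the fallback URL is an
# assumed local default, not taken from the diff.
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

# With the `ollama` Python package, the host is typically passed when
# constructing the client (kept commented so this sketch runs without the
# package installed):
#   from ollama import Client
#   client = Client(host=OLLAMA_HOST)

print(f"Talking to Ollama at {OLLAMA_HOST}")
```

Because the environment variable takes precedence over the hard-coded fallback, the same code works both inside the container (where the script injects the value) and when run directly on the host.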

Tips

- Trigger a new Sourcery review by commenting `@sourcery-ai review` on the pull request.
- Continue your discussion with Sourcery by replying directly to review comments.
- You can change your review settings at any time by accessing your [dashboard](https://app.sourcery.ai):
  - Enable or disable the Sourcery-generated pull request summary or reviewer's guide;
  - Change the review language.
- You can always [contact us](mailto:support@sourcery.ai) if you have any questions or feedback.