promptengineers-ai / llm-server

🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
https://promptengineersai.netlify.app

FROM feature/127-interpreter-as-tool TO development #128

Open ryaneggz opened 4 months ago

ryaneggz commented 4 months ago

closes #127

vercel[bot] commented 4 months ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

**1 Skipped Deployment**

| Name | Status | Preview | Comments | Updated (UTC) |
| :--- | :----- | :------ | :------- | :------------ |
| **llm-server** | ⬜️ Ignored ([Inspect](https://vercel.com/promptengineers-ai/llm-server/GhMykFaXRxgycmutFseFeCY2ZMyd)) | [Visit Preview](https://llm-server-git-feature-127-interprete-665768-promptengineers-ai.vercel.app) | | Jul 6, 2024 2:22am |
ryaneggz commented 1 day ago

This pull request introduces several changes to the project, primarily focused on adding a new Code Interpreter service and laying groundwork for a potential n8n integration. Here are the key points:

  1. Code Interpreter Service:

    • Added a new Docker service called "interpreter" using a custom image (promptengineers/interpreter:latest).
    • This service runs both Jupyter Notebook (port 8888) and FastAPI (port 8000).
    • It's designed to function as a tool, likely for executing code within the LLM server environment (a compose sketch follows this list).
  2. Database Changes:

    • Updated the MySQL service configuration.
    • Introduced an initialization script (init_db.sql) to set up databases and users.
    • Created separate databases for the main application (llm_server) and potentially for n8n (see the second sketch after this list).
  3. n8n Integration Preparation:

    • Added (but commented out) configuration for an n8n service, suggesting plans for a future workflow-automation integration (see the third sketch after this list).
  4. Version Control and Gitignore:

    • Updated .gitignore to exclude new directories related to the interpreter and n8n.
  5. Documentation:

    • Updated the Changelog.md to mention the new "interpreter-as-tool" feature.
  6. Docker Compose Updates:

    • Modified docker-compose.yml to include the new interpreter service and to update existing services.
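
For point 1, here is a minimal sketch of what the `interpreter` service entry in `docker-compose.yml` might look like. The image name and the two ports come from this PR; the restart policy and comments are illustrative assumptions, not the PR's exact configuration:

```yaml
# Hypothetical sketch of the new interpreter service (point 1).
# Image and ports are from the PR description; the rest is assumed.
services:
  interpreter:
    image: promptengineers/interpreter:latest
    ports:
      - "8888:8888"   # Jupyter Notebook
      - "8000:8000"   # FastAPI endpoint the LLM server can call as a tool
    restart: unless-stopped
```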
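For point 2, a common pattern with the official `mysql` image is to mount the init script into `/docker-entrypoint-initdb.d/`, which runs it on first container startup. The sketch below assumes that pattern; the host path, credentials, and volume name are placeholders, and `init_db.sql` itself would hold statements along the lines of `CREATE DATABASE llm_server;` plus matching `CREATE USER`/`GRANT` statements:

```yaml
# Hypothetical wiring for init_db.sql (point 2); values are placeholders.
services:
  mysql:
    image: mysql:8
    environment:
      MYSQL_ROOT_PASSWORD: change-me            # placeholder credential
    volumes:
      - ./scripts/init_db.sql:/docker-entrypoint-initdb.d/init_db.sql
      - mysql_data:/var/lib/mysql               # persist data across restarts

volumes:
  mysql_data:
```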
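For point 3, the commented-out n8n block would plausibly resemble the following; every value here is an assumption (5678 is n8n's default port, and the `DB_*` variables would point it at the `n8n` database mentioned above):

```yaml
# Hypothetical shape of the commented-out n8n service (point 3).
# Left commented, as in the PR; uncommenting would enable it.
# services:
#   n8n:
#     image: n8nio/n8n:latest
#     ports:
#       - "5678:5678"              # n8n's default web UI port
#     environment:
#       - DB_TYPE=mysqldb          # reuse the MySQL instance above
#       - DB_MYSQLDB_DATABASE=n8n
#     volumes:
#       - ./n8n_data:/home/node/.n8n
```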

Overall, this PR lays the groundwork for expanding the project's capabilities, particularly around code execution and, potentially, workflow automation. The addition of the Code Interpreter service is the most significant change, as it likely enables the LLM server to execute code as part of its operations.