Full LLM REST API with prompts, LLMs, Vector Databases, and Agents
### Setup Docker Services
```bash
docker-compose up --build
```
Before running the server, create your environment file by copying the example (`cp .example.env .env`) and fill in the values; see Environment Variables below.
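As a minimal sketch, a starting `.env` might contain entries like these. The values are the placeholder examples from the Environment Variables table, not working credentials:

```bash
# Placeholder values from the examples table; replace with real settings
APP_ENV=development
APP_VERSION=0.0.1
APP_SECRET=this-is-top-secret
APP_WORKERS=1
DATABASE_URL="mysql+aiomysql://admin:password@localhost:3306/llm_server"
REDIS_URL=redis://localhost:6379
OPENAI_API_KEY=sk-abc123
```

Keys for optional providers (Pinecone, Groq, Anthropic) can be added the same way as you enable those services.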
### Change into Backend directory
```bash
cd backend
```
### Setup Virtual Env
```bash
python3 -m venv .venv
```
### Activate Virtual Env
```bash
source .venv/bin/activate
```
### Install Runtime & Dev Dependencies
```bash
pip install -r requirements.txt -r requirements-dev.txt -c constraints.txt
```
### Install Runtime Dependencies
```bash
pip install -r requirements.txt -c constraints.txt
```
### Migrate Database Schema
```bash
alembic upgrade head
```
### Seed Database Users
```bash
python3 -m src.seeds.users 3
```
### Run Application on local machine
```bash
bash scripts/dev.sh
```
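The contents of `scripts/dev.sh` are not shown here. As a rough sketch, a development launcher for this kind of backend might look like the function below; the `uvicorn` server, the `src.main:app` module path, and the port are all assumptions, not taken from the repository:

```bash
# Hypothetical sketch of what scripts/dev.sh might do: load .env, then
# start the API with auto-reload. Entry point and port are assumptions.
dev_server() {
  # Export variables from the local .env file if present
  if [ -f .env ]; then
    set -a
    . ./.env
    set +a
  fi
  # Assumed entry point; adjust to the actual app module
  uvicorn src.main:app --host 0.0.0.0 --port 8000 --reload
}
```

The real script may also handle worker counts (`APP_WORKERS`) or run migrations first; check the file itself before relying on this sketch.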
### Change into Frontend directory
```bash
cd frontend
```
### Install node_modules
```bash
npm install
```
### Start Development Server
```bash
npm run dev
```
### Environment Variables

| Variable Name | Example | Description |
|---|---|---|
| APP_ENV | development | Environment where the application is running |
| APP_VERSION | 0.0.1 | Version of the application |
| APP_SECRET | this-is-top-secret | Secret key for the application |
| APP_WORKERS | 1 | Number of application workers |
| APP_ADMIN_EMAIL | admin@example.com | Admin email for the application |
| APP_ADMIN_PASS | test1234 | Admin password for the application |
| TEST_USER_ID | 0000000000000000000000000 | Test user ID |
| DATABASE_URL | mysql+aiomysql://admin:password@localhost:3306/llm_server | URL for the database |
| PINECONE_API_KEY | | API key for Pinecone services |
| PINECONE_ENV | us-east1-gcp | Pinecone environment configuration |
| PINECONE_INDEX | default | Default Pinecone index used |
| REDIS_URL | redis://localhost:6379 | URL for the Redis service |
| OPENAI_API_KEY | sk-abc123... | Default OpenAI API key for LLM calls |
| GROQ_API_KEY | | API key for accessing Groq services |
| ANTHROPIC_API_KEY | | API key for accessing Anthropic services |
| OLLAMA_BASE_URL | http://localhost:11434 | Base URL for the Ollama service |
| SEARX_SEARCH_HOST_URL | http://localhost:8080 | URL for the Searx search service |
| MINIO_HOST | localhost:9000 | URL to the object storage |
| BUCKET | my-documents | Name of the MinIO or S3 bucket |
| S3_REGION | us-east-1 | Region where the S3 bucket exists |
| ACCESS_KEY_ID | AKIAIOSFODNN7EXAMPLE | IAM user access key ID (optional) |
| ACCESS_SECRET_KEY | wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY | IAM secret access key (optional) |
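Since the server reads these settings at startup, a small guard that fails fast when required variables are unset can save debugging time. A minimal sketch follows; the variable list used in the usage note is illustrative, not the server's actual requirements:

```bash
# Minimal startup guard: report any variables that are unset or empty.
check_env() {
  missing=""
  for v in "$@"; do
    # Indirect lookup of each variable name passed as an argument
    eval "val=\${$v:-}"
    if [ -z "$val" ]; then
      missing="$missing $v"
    fi
  done
  if [ -n "$missing" ]; then
    echo "Missing:$missing"
    return 1
  fi
}
```

For example, `check_env APP_ENV DATABASE_URL REDIS_URL || exit 1` near the top of a launch script would stop the server with a clear message instead of failing later with an obscure connection error.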
Here are the upcoming features I (ryaneggleston@promptengineers.ai) am excited to bring to Prompt Engineers AI - LLM Server (more to come):
Create an issue and let's start a discussion if you'd like to see a feature added to the roadmap.
We welcome contributions from the community, from beginners to seasoned developers. Here's how you can contribute:
1. Fork the repository: Click on the 'Fork' button at the top right corner of the repository page on GitHub.
2. Clone the forked repository to your local machine: `git clone <forked_repo_link>`
3. Navigate to the project folder: `cd llm-server`
4. Create a new branch for your changes: `git checkout -b <branch_name>`
5. Make your changes in the new branch.
6. Commit your changes: `git commit -am 'Add some feature'`
7. Push to the branch: `git push origin <branch_name>`
8. Open a Pull Request: Go back to your forked repository on GitHub and click on 'Compare & pull request' to create a new pull request.
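The branch-and-commit steps above can be rehearsed end to end against a throwaway local repository before touching your real fork. All names and paths below are placeholders:

```bash
# Dry-run of the branch-and-commit workflow in a temporary repository
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init -q
git config user.email "you@example.com"
git config user.name "Your Name"
# Create a branch for the change, edit a file, then commit
git checkout -q -b feature/my-change
echo "demo change" > feature.txt
git add feature.txt
git commit -qm "Add some feature"
git log --oneline
```

Against the real fork, the only differences are cloning instead of `git init` and pushing the branch with `git push origin <branch_name>` when you are done.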
Please ensure that your code passes all tests and, if possible, add tests for new features. Always write a clear and concise commit message and pull request description.
Feel free to submit issues and enhancement requests. We're always looking for feedback and suggestions.
Ryan Eggleston
- ryaneggleston@promptengineers.ai
This project is open-source, under the MIT License. Feel free to use, modify, and distribute the code as you please.
Happy Prompting!