Prompt Engineering at your fingertips
Features
- LLM Proxy Access: Seamless access to the latest LLMs from OpenAI, Anthropic, and Google.
- Custom and Local LLM Support: Use custom or local open-source LLMs through Ollama.
- Prompt Playground UI: A user-friendly interface for engineering and fine-tuning your prompts.
- Python SDK: Easily integrate LLMstudio into your existing workflows (see the sketch after this list).
- Monitoring and Logging: Keep track of your usage and performance for all requests.
- LangChain Integration: LLMstudio plugs into your existing LangChain projects.
- Batch Calling: Send multiple requests at once for improved efficiency.
- Smart Routing and Fallback: Ensure 24/7 availability by routing your requests to trusted LLMs.
- Type Casting (coming soon): Convert data types as needed for your specific use case.
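
To give a feel for the Python SDK feature above, here is a minimal sketch of a chat call. The import path, the `LLM` class, the model identifier, and the `chat` method are assumptions based on typical provider-style SDKs, not the documented API; check https://docs.llmstudio.ai for the exact interface.

```python
# Minimal sketch of a Python SDK call (see the "Python SDK" feature above).
# NOTE: the import path, the LLM class, and the .chat() method are assumptions,
# not the documented API; consult https://docs.llmstudio.ai for the real interface.
from llmstudio import LLM  # assumed entry point

model = LLM("openai/gpt-4o")  # hypothetical provider/model identifier
response = model.chat("Write a one-line haiku about prompt engineering.")
print(response)
```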
Quickstart
Don't forget to check out the documentation at https://docs.llmstudio.ai.
Installation
Install the latest version of LLMstudio using pip. We suggest that you create and activate a new environment using conda.

```bash
pip install llmstudio
```
Install bun if you want to use the UI:

```bash
curl -fsSL https://bun.sh/install | bash
```
Create a `.env` file at the same path you'll run LLMstudio:

```env
OPENAI_API_KEY="sk-api_key"
ANTHROPIC_API_KEY="sk-api_key"
```
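
The server picks up these keys when it starts. If you also call providers from your own Python code, one common pattern is to load the same file with python-dotenv; this is a minimal sketch under that assumption (python-dotenv is not something LLMstudio requires, and LLMstudio may already read the `.env` file for you).

```python
# Minimal sketch: load the same .env file before using the keys in your own code.
# Assumes the python-dotenv package is installed (pip install python-dotenv);
# LLMstudio itself may already read this file for you.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

print("OpenAI key loaded:", bool(os.getenv("OPENAI_API_KEY")))
print("Anthropic key loaded:", bool(os.getenv("ANTHROPIC_API_KEY")))
```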
Now you should be able to run LLMstudio using the following command:

```bash
llmstudio server --ui
```
When the `--ui` flag is set, you'll be able to access the UI at http://localhost:3000.
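
With the server running, the LangChain integration listed in the features can be exercised by pointing a client at the local proxy. The sketch below uses LangChain's `ChatOpenAI` and assumes the proxy exposes an OpenAI-compatible endpoint; the base URL and port shown are placeholders, so check the docs for the actual host, port, and integration class.

```python
# Sketch of calling the running LLMstudio proxy from LangChain.
# ASSUMPTION: the proxy exposes an OpenAI-compatible endpoint; the URL and port
# below are placeholders, not documented values; see https://docs.llmstudio.ai.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    base_url="http://localhost:8000/v1",  # hypothetical proxy address
    api_key="placeholder",  # placeholder if the proxy holds the real keys
)
print(llm.invoke("Say hello from LLMstudio.").content)
```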
Documentation

Full documentation is available at https://docs.llmstudio.ai.
Contributing
- Head over to our Contribution Guide to see how you can help LLMstudio.
- Join our Discord to talk with other LLMstudio enthusiasts.
Training
Thank you for choosing LLMstudio. Your journey to perfecting AI interactions starts here.