⚠️ Disclaimer: This project is now in maintenance mode. I won't be adding new features or actively maintaining it, as I have moved on to other projects and priorities. I will address critical bugs and security issues as needed, but active development has ceased on my end. I encourage the community to keep contributing if they find the project useful. Thank you for using Lanarky!
Lanarky is a Python (3.9+) web framework for developers who want to build microservices using LLMs.
To learn more about Lanarky and get started, see the full documentation at lanarky.ajndkr.com.
The library is available on PyPI and can be installed via pip:
pip install lanarky
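To confirm the installation, the snippet below is a minimal sketch that prints the installed version using the standard library's importlib.metadata; it assumes nothing about Lanarky beyond its package name on PyPI.

# sanity check: print the installed lanarky version (stdlib only)
from importlib.metadata import version

print(version("lanarky"))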
Lanarky provides a powerful abstraction layer that lets developers build simple LLM microservices in just a few lines of code.
Here's an example of a simple microservice that uses OpenAI's ChatCompletion service:
from lanarky import Lanarky
from lanarky.adapters.openai.resources import ChatCompletionResource
from lanarky.adapters.openai.routing import OpenAIAPIRouter

app = Lanarky()
router = OpenAIAPIRouter()


@router.post("/chat")
def chat(stream: bool = True) -> ChatCompletionResource:
    # system prompt passed to OpenAI's ChatCompletion API
    system = "You are a sassy assistant"
    return ChatCompletionResource(stream=stream, system=system)


app.include_router(router)
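Since a Lanarky app is built on FastAPI, you can serve the example above like any ASGI application (for instance, save it as app.py and run uvicorn app:app) and call it over HTTP. The client below is a minimal sketch, not part of Lanarky itself: it assumes OPENAI_API_KEY is set in the environment of the running service and that the generated /chat endpoint accepts an OpenAI-style messages payload; check the Getting Started guide for the exact request and response schema.

# hedged sketch of a client for the /chat endpoint defined above
import httpx

response = httpx.post(
    "http://localhost:8000/chat",
    params={"stream": "false"},  # ask for a plain JSON response instead of a stream
    json={"messages": [{"role": "user", "content": "Hello there!"}]},  # assumed schema
    timeout=30,
)
response.raise_for_status()
print(response.json())

With stream left at its default of true, the endpoint streams the response instead of returning it in one piece, which is where Lanarky's streaming support comes in.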
Visit Getting Started for the full tutorial on building and testing your first LLM microservice with Lanarky.
Contributions are more than welcome! If you have an idea for a new feature or want to help improve Lanarky, please open an issue or submit a pull request on GitHub.
See CONTRIBUTING.md for more information.
See Lanarky Roadmap for the list of new features and future milestones.
The library is released under the MIT License.