
The web framework for building LLM microservices.

[![Stars](https://img.shields.io/github/stars/ajndkr/lanarky)](https://github.com/ajndkr/lanarky/stargazers) [![License](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/ajndkr/lanarky/blob/main/LICENSE) [![Twitter](https://img.shields.io/twitter/follow/LanarkyAPI?style=social)](https://twitter.com/intent/follow?screen_name=LanarkyAPI) [![Python](https://img.shields.io/pypi/pyversions/lanarky.svg)](https://pypi.org/project/lanarky/) [![Coverage](https://coveralls.io/repos/github/ajndkr/lanarky/badge.svg?branch=main)](https://coveralls.io/github/ajndkr/lanarky?branch=main) [![Version](https://badge.fury.io/py/lanarky.svg)](https://pypi.org/project/lanarky/) [![Stats](https://img.shields.io/pypi/dm/lanarky.svg)](https://pypistats.org/packages/lanarky)

⚠️ Disclaimer: This project is now in maintenance mode. I won't be adding new features or actively maintaining the project as I have moved on to other projects and priorities. While I will address critical bugs and security issues as needed, active development has ceased from my end. I do encourage the community to continue to contribute to the project if they find it useful. Thank you for using lanarky!

Lanarky is a Python (3.9+) web framework for developers who want to build microservices using LLMs.

To learn more about Lanarky and get started, see the full documentation at lanarky.ajndkr.com.

Installation

The library is available on PyPI and can be installed via pip:

pip install lanarky
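
The Getting Started example below uses the OpenAI adapter, which builds on the OpenAI Python client. That client reads your API key from the standard OPENAI_API_KEY environment variable by default, so make sure it is set before running the service:

export OPENAI_API_KEY=<your-api-key>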

Getting Started

Lanarky provides a powerful abstraction layer that lets developers build simple LLM microservices in just a few lines of code.

Here's an example of a simple microservice that uses OpenAI's ChatCompletion service:

from lanarky import Lanarky
from lanarky.adapters.openai.resources import ChatCompletionResource
from lanarky.adapters.openai.routing import OpenAIAPIRouter

app = Lanarky()
router = OpenAIAPIRouter()

@router.post("/chat")
def chat(stream: bool = True) -> ChatCompletionResource:
    system = "You are a sassy assistant"
    return ChatCompletionResource(stream=stream, system=system)

app.include_router(router)
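
To try the service locally, save the snippet above (for example as app.py, a hypothetical file name), start it with uvicorn, and call the endpoint with any HTTP client. The sketch below is an illustration, not Lanarky's official client: it assumes the request body follows OpenAI's chat message format and that stream is passed as a query parameter, matching the endpoint signature above.

uvicorn app:app

import httpx

# Ask for a non-streaming response by overriding the `stream` query parameter
# declared in the endpoint signature. The body shape (OpenAI-style chat
# messages) is an assumption; check the documentation for the exact schema
# and for Lanarky's built-in testing clients.
response = httpx.post(
    "http://localhost:8000/chat",
    params={"stream": "false"},
    json={"messages": [{"role": "user", "content": "Hello!"}]},
    timeout=30.0,
)
print(response.text)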

Visit Getting Started for the full tutorial on building and testing your first LLM microservice with Lanarky.

Contributing


Contributions are more than welcome! If you have an idea for a new feature or want to help improve lanarky, please create an issue or submit a pull request on GitHub.

See CONTRIBUTING.md for more information.

See Lanarky Roadmap for the list of new features and future milestones.

License

The library is released under the MIT License.