
FastAPI framework to build production-grade LLM applications
https://lanarky.readthedocs.io/en/latest/
MIT License
[![stars](https://img.shields.io/github/stars/ajndkr/lanarky)](https://github.com/ajndkr/lanarky/stargazers) [![Documentation](https://img.shields.io/badge/documentation-ReadTheDocs-blue.svg)](https://lanarky.readthedocs.io/en/latest/) [![Code Coverage](https://coveralls.io/repos/github/ajndkr/lanarky/badge.svg?branch=main)](https://coveralls.io/github/ajndkr/lanarky?branch=main) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/ajndkr/lanarky/blob/main/LICENSE) [![Twitter](https://img.shields.io/twitter/follow/lanarky_io?style=social)](https://twitter.com/intent/follow?screen_name=lanarky_io) [![PyPI version](https://badge.fury.io/py/lanarky.svg)](https://pypi.org/project/lanarky/) [![PyPI stats](https://img.shields.io/pypi/dm/lanarky.svg)](https://pypistats.org/packages/lanarky) [![Supported Python Versions](https://img.shields.io/pypi/pyversions/lanarky.svg)](https://pypi.org/project/lanarky/)

A FastAPI framework to build production-grade LLM applications.

Table of Contents

- [🚀 Features](#-features)
- [❓ Why?](#-why)
- [💾 Installation](#-installation)
- [🔥 Build your first LLM app](#-build-your-first-llm-app)
- [📝 Roadmap](#-roadmap)
- [🤩 Stargazers](#-stargazers)
- [🤝 Contributing](#-contributing)
- [⚖️ License](#-license)
- [✨ Want to build LLM applications with us?](#-want-to-build-llm-applications-with-us)

🚀 Features

See Roadmap for upcoming features.

❓ Why?

Many open-source projects for developing and deploying LLM applications either impose opinionated designs, particularly around deployment, or are limited in how far they scale. This is where Lanarky comes in. Lanarky is an open-source project that gives Python users an unopinionated web framework for building and deploying LLM applications. Because it is built on FastAPI, applications written with Lanarky are production-ready and can be deployed on any cloud provider.

💾 Installation

The library is available on PyPI and can be installed via pip:

```bash
pip install lanarky
```

You can find the full documentation at https://lanarky.readthedocs.io/en/latest/.

🔥 Build your first LLM app

```python
from dotenv import load_dotenv
from fastapi import FastAPI
from langchain import ConversationChain
from langchain.chat_models import ChatOpenAI

from lanarky import LangchainRouter

load_dotenv()  # read OPENAI_API_KEY from the .env file
app = FastAPI()

# Mount a Langchain chain at /chat; streaming_mode=0 disables response streaming
langchain_router = LangchainRouter(
    langchain_url="/chat",
    langchain_object=ConversationChain(
        llm=ChatOpenAI(temperature=0), verbose=True
    ),
    streaming_mode=0,
)

app.include_router(langchain_router)
```
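Once the router is mounted, the app is served like any other FastAPI application. The snippet below is a minimal sketch, assuming the code above is saved as `app.py` in the current directory; the `/chat` path comes from the `langchain_url` argument.

```python
# Sketch: serve the Lanarky app with uvicorn.
# Assumes the quickstart snippet above is saved as app.py.
import uvicorn

if __name__ == "__main__":
    uvicorn.run("app:app", host="127.0.0.1", port=8000)
```

With the server running, the chain is exposed at `http://127.0.0.1:8000/chat`; the request schema is derived from the chain's input keys, so check FastAPI's interactive docs at `/docs` for the exact payload.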

See the `examples/` directory for the available demos.

Create a `.env` file from `.env.sample` and add your OpenAI API key to it before running the examples.
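The `load_dotenv()` call in the quickstart reads this file into the process environment. As a rough illustration of what that entails, a `.env` file is just `KEY=VALUE` lines; the parser below is a simplified sketch, not python-dotenv's actual implementation, and the key name is a throwaway example.

```python
import os

def load_env_file(path: str) -> None:
    """Simplified sketch of .env loading: put KEY=VALUE lines into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, and malformed lines
            key, _, value = line.partition("=")
            # like python-dotenv's default, do not overwrite existing variables
            os.environ.setdefault(key.strip(), value.strip())

# Example: write a throwaway env file and load it
with open("demo.env", "w") as f:
    f.write("# sample\nLANARKY_DEMO_KEY=sk-test-placeholder\n")
load_env_file("demo.env")
print(os.environ["LANARKY_DEMO_KEY"])
```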


πŸ“ Roadmap

🤩 Stargazers

Leave a ⭐ if you find this project useful.

Star History Chart

🤝 Contributing


Contributions are more than welcome! If you have an idea for a new feature or want to help improve Lanarky, please open an issue or submit a pull request on GitHub.

See CONTRIBUTING.md for more information.

Contributors

βš–οΈ License

The library is released under the MIT License.

✨ Want to build LLM applications with us?

Interested in building LLM applications with us? We would love to hear from you! Reach out on Twitter @lanarky_io.

Let's connect and explore how we can build amazing LLM applications with Lanarky together!