emirsahin1 / llm-axe

A simple, intuitive toolkit for quickly implementing LLM powered applications.
MIT License

llm-axe

Goal

llm-axe is meant to be a flexible toolkit that provides simple abstractions for commonly used LLM-related functionality. It's not meant to intrude on your development workflow the way larger frameworks often do.

It has functions for automatic schema generation, pre-made agents with self-tracking chat history, and fully customizable agents.
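The idea behind automatic schema generation can be pictured with a few lines of standard-library Python. This is an illustrative sketch only, not llm-axe's actual implementation; the helper name `make_schema` is hypothetical:

```python
import inspect

def make_schema(func):
    """Build a simple function-calling schema from a Python function's signature."""
    params = {}
    for name, p in inspect.signature(func).parameters.items():
        # Use the type annotation when present, otherwise fall back to "any"
        if p.annotation is not inspect.Parameter.empty:
            params[name] = p.annotation.__name__
        else:
            params[name] = "any"
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": params,
    }

def add(a: int, b: int):
    """Adds two numbers."""
    return a + b

print(make_schema(add))
# {'name': 'add', 'description': 'Adds two numbers.', 'parameters': {'a': 'int', 'b': 'int'}}
```

A schema like this can be serialized into the prompt so the LLM knows which functions exist and what arguments they take.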

Have feedback/questions? Join the Discord

Installation

pip install llm-axe

Example Snippets

- **Function Calling**

  A function calling LLM can be created with just **3 lines of code**:
  No need for premade schemas, templates, special prompts, or specialized functions.
```python
prompt = "I have 500 coins, I just got 200 more. How many do I have?"

llm = OllamaChat(model="llama3:instruct")

# get_time, get_date, get_location, add, multiply are plain Python functions
fc = FunctionCaller(llm, [get_time, get_date, get_location, add, multiply])
result = fc.get_function(prompt)
```

- **Online Agent** (searches the internet to answer a prompt)

  output: Based on information from the internet, it appears that https://toscrape.com/ is a website dedicated to web scraping. It provides a sandbox environment for beginners and developers to learn and validate their web scraping technologies...
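Conceptually, a function-calling agent has the LLM pick a function and its arguments, then dispatches the call itself. A minimal sketch of that dispatch step, using a hypothetical reply format rather than llm-axe's internals:

```python
import json

def add(a, b):
    return a + b

def multiply(a, b):
    return a * b

# Registry of callable tools, keyed by name
TOOLS = {f.__name__: f for f in (add, multiply)}

# Hypothetical JSON reply the LLM might produce for the coins prompt
llm_reply = '{"function": "add", "parameters": {"a": 500, "b": 200}}'

choice = json.loads(llm_reply)
result = TOOLS[choice["function"]](**choice["parameters"])
# result == 700
```

The hard part, which the library abstracts away, is getting the LLM to reliably emit a parseable choice in the first place.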

- **PDF Reader**
```python
llm = OllamaChat(model="llama3:instruct")
files = ["../FileOne.pdf", "../FileTwo.pdf"]
agent = PdfReader(llm)
resp = agent.ask("Summarize these documents for me", files)
```

- **Data Extractor**

  output: {'Name': 'Frodo Baggins', 'Email': 'frodo@gmail.com', 'Phone': '555-555-5555', 'Address': 'Bag-End, Hobbiton, The Shire'}

- **Object Detector**
```python
llm = OllamaChat(model="llava:7b")
detector = ObjectDetectorAgent(llm, llm)
resp = detector.detect(images=["../img2.jpg"], objects=["sheep", "chicken", "cat", "dog"])

#{
#  "objects": [
#    { "label": "Sheep", "location": "Field", "description": "White, black spots" },
#    { "label": "Dog", "location": "Barn", "description": "Brown, white spots" }
#  ]
#}
```
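Because the detector's response is JSON, it can be post-processed directly. A small sketch filtering the detections, with field names taken from the sample output above:

```python
import json

# Sample detector response (copied from the output shown above)
response = '''{
  "objects": [
    { "label": "Sheep", "location": "Field", "description": "White, black spots" },
    { "label": "Dog", "location": "Barn", "description": "Brown, white spots" }
  ]
}'''

detections = json.loads(response)["objects"]
labels = [obj["label"] for obj in detections]
# labels == ['Sheep', 'Dog']
```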

See more complete examples

How to set up llm-axe with your own LLM
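To plug in your own LLM, you wrap it in a thin object the agents can call. The sketch below is hypothetical: the `ask` method name and its signature are assumptions modeled on the `OllamaChat` usage above, so check the project's setup docs for the exact interface:

```python
# Hypothetical custom-LLM wrapper. The agents only need an object that turns
# a list of chat messages into a text reply; the "ask" name/signature here is
# an assumption, not confirmed llm-axe API.
class MyCustomLlm:
    def ask(self, prompts: list, format: str = "", temperature: float = 0.8) -> str:
        # prompts is a list of {"role": ..., "content": ...} chat messages.
        # Call your own model or backend here and return its text response.
        last = prompts[-1]["content"]
        return f"echo: {last}"  # stub response for illustration

llm = MyCustomLlm()
reply = llm.ask([{"role": "user", "content": "hello"}])
# reply == "echo: hello"
```

Once wrapped, the object can be passed to agents the same way `OllamaChat` is in the snippets above.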

Features

Important Notes

The results you get from the agents depend heavily on the capability of your LLM. An inadequate LLM will not produce results that are usable with llm-axe.

Testing during development was done using llama3 8b instruct (4-bit quant).