IDinsight / aaq-core

No-code, easy-to-setup, reliable content manager and RAG plugin for chatbots in the social sector
https://ask-a-question.com
BSD 3-Clause "New" or "Revised" License


Developer Docs | Features | Usage | Architecture | Funders and Partners

Ask A Question is a free and open-source tool created to help non-profit organizations, governments in developing nations, and social sector organizations use Large Language Models for responding to citizen inquiries in their native languages.

:woman_cartwheeling: Features

:question: LLM-powered search

Match your questions to content in the database using embeddings from LLMs.

:robot: LLM responses

Craft a custom response to the question using LLM chat and the content in your database.

:electric_plug: Integrate with your own chatbot

Connect to your own chatbot on platforms like Turn.io, Glific, and Typebot using our APIs.

:books: Manage content

Use the AAQ App to add, edit, and delete content in the database (sign up for a demo here).

:rotating_light: Message Triaging

Identify urgent or important messages based on your own criteria.

:office_worker: Content manager dashboard

See which content is most sought after, which kinds of questions receive poor feedback, where content is missing, and more.

:construction: Upcoming

:speech_balloon: Conversation capability

Refine or clarify your question through conversation.

:video_camera: Multimedia content

Respond with not just text but voice, images, and videos as well.

:technologist: Engineering dashboard

Monitor uptime, response rates, throughput, HTTP response codes, and more.

[!NOTE] Looking for other features? Please raise an issue with [FEATURE REQUEST] at the start of the title.

Usage

There are two major endpoints for question answering, shown below.

See docs or API docs for more details and other API endpoints.

:question: Embeddings search

curl -X 'POST' \
  'https://[DOMAIN]/api/embeddings-search' \
  -H 'accept: application/json' \
  -H 'Authorization: Bearer <BEARER TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
  "query_text": "how are you?",
  "query_metadata": {}
}'

:robot: LLM response

curl -X 'POST' \
  'https://[DOMAIN]/api/llm-response' \
  -H 'accept: application/json' \
  -H 'Authorization: Bearer <BEARER TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
  "query_text": "this is my question",
  "query_metadata": {}
}'

:books: Manage content

You can access the admin console at

https://[DOMAIN]/

Architecture

We use docker-compose to orchestrate containers with a reverse proxy that manages all incoming traffic to the service. The database and LiteLLM proxy are only accessed by the core app.
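The topology described above might look roughly like the following docker-compose sketch. Service names and images here are illustrative assumptions, not the repository's actual compose file; the point is that only the reverse proxy is exposed, and only the core app talks to the database and LiteLLM proxy.

```yaml
# Illustrative sketch only -- not the repository's actual docker-compose file.
services:
  reverse_proxy:        # all incoming traffic enters here
    image: caddy        # assumed proxy; could equally be nginx/traefik
    ports:
      - "443:443"       # the only port published to the outside
  core_app:             # the core FastAPI app, reached only via the proxy
    build: .
    depends_on:
      - db
      - litellm_proxy
  litellm_proxy:        # LiteLLM proxy, reachable only by core_app
    image: ghcr.io/berriai/litellm:main-latest
  db:                   # database, reachable only by core_app
    image: postgres
```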

Flow

Documentation

See here for full documentation.

Funders and Partners

Google.org