pinecone-io / canopy

Retrieval Augmented Generation (RAG) framework and context engine powered by Pinecone
https://www.pinecone.io/
Apache License 2.0

Add support for OctoAI LLM and embeddings #301

Closed ptorru closed 8 months ago

ptorru commented 9 months ago

Problem

Currently, there is no way to invoke OctoAI endpoints from Canopy.

Solution

Add OctoAI adaptors modeled on the existing Anyscale ones, and update the README to document the new environment variables.
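
For illustration, here is a minimal sketch of how the new adaptors might be used directly, assuming class names and constructor arguments that mirror the existing Anyscale adaptors (OctoAILLM, OctoAIRecordEncoder); the model identifiers below are placeholders, not the PR's actual defaults.

# Minimal sketch, assuming OctoAILLM / OctoAIRecordEncoder mirror the Anyscale
# adaptors; model names and defaults below are placeholders.
import os

from canopy.models.data_models import Query
from canopy.knowledge_base.record_encoder import OctoAIRecordEncoder  # assumed name
from canopy.llm import OctoAILLM  # assumed name

os.environ.setdefault("OCTOAI_API_KEY", "...")  # normally provided via .env

# Embeddings adaptor: encodes queries/documents via an OctoAI-hosted embedding model.
encoder = OctoAIRecordEncoder()  # assumed to default to an OctoAI embedding model
query_vectors = encoder.encode_queries([Query(text="What does OctoAI host?")])
print(len(query_vectors[0].values))  # dimensionality of the returned embedding

# LLM adaptor: plugs in wherever an OpenAI-compatible LLM is expected; in practice it
# is selected through CANOPY_CONFIG_FILE rather than constructed by hand.
llm = OctoAILLM(model_name="mistral-7b-instruct")  # assumed model identifier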

Type of Change

New feature: adds OctoAI LLM and embedding adaptors (non-breaking).

Test Plan

Added a unit test for the record encoder, which can be run with:

poetry run pytest tests/system/record_encoder/test_octoai_record_encoder.py
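
For reference, a rough sketch of what such a system test could look like (not the PR's actual test), assuming the OctoAIRecordEncoder class name and a live OCTOAI_API_KEY in the environment:

import pytest

from canopy.models.data_models import Query
from canopy.knowledge_base.models import KBDocChunk
from canopy.knowledge_base.record_encoder import OctoAIRecordEncoder  # assumed name


@pytest.fixture
def encoder():
    # Assumes the encoder reads OCTOAI_API_KEY from the environment.
    return OctoAIRecordEncoder()


def test_encode_documents_returns_vectors(encoder):
    chunks = [KBDocChunk(id="c1", text="hello world", document_id="d1", source="")]
    encoded = encoder.encode_documents(chunks)
    assert len(encoded) == 1 and len(encoded[0].values) > 0


def test_encode_queries_returns_vectors(encoder):
    encoded = encoder.encode_queries([Query(text="hello")])
    assert len(encoded) == 1 and len(encoded[0].values) > 0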

Performed a manual end-to-end test using the Canopy server and chat CLI.

Use the following .env file:

PINECONE_API_KEY="..."
OCTOAI_API_KEY="..."
INDEX_NAME="octo"
CANOPY_CONFIG_FILE="src/canopy/config_templates/octoai.yaml"
OPENAI_API_KEY="..."
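
If the CLI does not seem to pick these values up, a purely illustrative way to sanity-check the environment (assuming python-dotenv is installed in the same virtual environment):

# Illustrative check that the .env above is visible before launching Canopy.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

required = ["PINECONE_API_KEY", "OCTOAI_API_KEY", "INDEX_NAME",
            "CANOPY_CONFIG_FILE", "OPENAI_API_KEY"]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("CANOPY_CONFIG_FILE =", os.environ["CANOPY_CONFIG_FILE"])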

Then:

poetry run canopy new
poetry run canopy upsert doc.txt
poetry run canopy start

In a separate terminal:

poetry run canopy chat --no-stream
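
The running server can also be queried programmatically instead of via the chat CLI; a sketch assuming the default address (http://localhost:8000) and Canopy's OpenAI-compatible /v1 route:

# Sketch of hitting the Canopy server's OpenAI-compatible chat endpoint; the host,
# port, and route prefix are assumptions based on the server defaults.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

response = client.chat.completions.create(
    model="",  # the server uses the LLM configured in octoai.yaml, not this field
    messages=[{"role": "user", "content": "What does doc.txt talk about?"}],
)
print(response.choices[0].message.content)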