TransformerOptimus / SuperAGI

<⚡️> SuperAGI - A dev-first open source autonomous AI agent framework. Enabling developers to build, manage & run useful autonomous agents quickly and reliably.
https://superagi.com/
MIT License
15.34k stars 1.85k forks

Feature req: Please integrate apipie.ai #1415

Open EncryptShawn opened 6 months ago

EncryptShawn commented 6 months ago

⚠️ Check for existing issues before proceeding. ⚠️

Where are you using SuperAGI?

Windows

Which branch of SuperAGI are you using?

Main

Do you use OpenAI GPT-3.5 or GPT-4?

GPT-3.5

Which area covers your issue best?

Installation and setup

Describe your issue.

Developers want access to as many AI models as they can get. They don't want to manage 50 accounts; they want the fastest AI and the cheapest AI, and you can provide all of that for them.

In addition to (or in place of) integrating with other aggregators, please integrate APIpie so devs can access them all from one place/subscription. It also provides:

- The most affordable, reliable, and fastest AI available
- One API to access ~500 models and growing
- Language, embedding, voice, image, vision, and more
- Global AI load balancing; route queries based on price or latency
- Redundancy for major models, providing the greatest uptime possible
- Global reporting of AI availability, pricing, and performance

It's the same API format as OpenAI: just change the domain name and your API key, and enjoy a plethora of models without changing any of your code other than how you handle the models list.
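To illustrate the claim above, here is a minimal sketch of the drop-in swap using only the Python standard library. The request body is the standard OpenAI chat-completions format; the model name and API key are placeholders, and the endpoint path is assumed to mirror OpenAI's `/chat/completions`.

```python
import json
import urllib.request

# Only the base URL and the API key change versus a stock OpenAI setup.
API_BASE = "https://apipie.ai/v1"  # instead of https://api.openai.com/v1
API_KEY = "YOUR_APIPIE_KEY"        # placeholder

# Standard OpenAI chat-completions payload; model name is a placeholder.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    f"{API_BASE}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Sending the request is identical to the OpenAI case (needs a real key):
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```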

This is a win-win for everyone: any new models from any provider become automatically available in your stack through this one integration, not to mention all the other advantages.

How to replicate your Issue?

apipie.ai does not appear to be integrated

Upload Error Log Content

apipie.ai

AFKler commented 5 months ago

https://apipie.ai/docs/api/chat

You can easily set it up yourself. Change `OPENAI_API_BASE` in your `config.yaml` like this:

```yaml
# For locally hosted LLMs comment out the next line and uncomment the one after
# to configure a local llm point your browser to 127.0.0.1:7860 and click on the model tab in text generation web ui.
OPENAI_API_BASE: https://apipie.ai/v1
```

Now run `docker compose up --build` again. Go to Settings, paste your API key under OpenAI, and hit Save. Your desired model might be retrieved from apipie automatically; if not, add it with the correct name.
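If the model list is not picked up automatically, you can check which names the aggregator exposes before adding one by hand. This sketch assumes apipie.ai mirrors OpenAI's `GET /v1/models` endpoint; the API key is a placeholder.

```python
import json
import urllib.request

# Build a request against the (assumed OpenAI-compatible) models endpoint.
req = urllib.request.Request(
    "https://apipie.ai/v1/models",
    headers={"Authorization": "Bearer YOUR_APIPIE_KEY"},  # placeholder key
)

# With a real key, the response lists model ids in the OpenAI schema:
# with urllib.request.urlopen(req) as resp:
#     ids = [m["id"] for m in json.load(resp)["data"]]
```

Use the exact `id` string returned here when adding the model in the SuperAGI settings.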