dustinblackman / oatmeal

Terminal UI to chat with large language models (LLM) using different model backends, and integrations with your favourite editors!
https://dustinblackman.com/posts/oatmeal/
MIT License

Support perplexity #36

Closed aemonge closed 5 months ago

aemonge commented 7 months ago

oatmeal --open-ai-token=$(cat ~/.ssh/perplexity-token ) --open-ai-url="https://api.perplexity.ai" --backend=openai

╭Oatmeal───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Hey, it looks like backend openai isn't running, I can't connect to it. You should double check that before we start talking, otherwise I may crash. │
│                                                                                                                                                      │
│ Error: OpenAI health check failed                                                                                                                    │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
dustinblackman commented 7 months ago

First time I'm hearing about Perplexity! Do they claim to offer an OpenAI-compatible API? Looking at their docs, they're very sparse, and it doesn't look like they support response streaming.

aemonge commented 7 months ago

Yes, it's very new to the market. But so far it offers a better experience than GPT-4, since you can easily switch from one LLM backend to another.

I'm unaware of any such claim; as you mentioned, the docs are pretty simplistic.

aislasq commented 5 months ago

This is all the documentation I could find. It looks like it is compatible with the OpenAI API, including streaming (I haven't tested it yet): https://docs.perplexity.ai/reference/post_chat_completions

It looks like it fails for two reasons:

The draft PR I have open for GitHub Copilot Chat shares pretty much the same code as OpenAI (and Perplexity, by the looks of it) in the get_completions method; only the headers change (which are very important). It could be appropriate to refactor this code into a single openai_compatible_completions method of sorts and have each backend call it with its respective URL, headers, and data, as sketched below.
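For illustration only, here's a minimal Rust sketch of what such a shared helper could look like. The name openai_compatible_completions, the reqwest/tokio setup, and the PERPLEXITY_TOKEN variable are assumptions for this example, not oatmeal's actual code; it assumes the backend exposes an OpenAI-style /chat/completions endpoint and differs only in base URL and headers.

```rust
// Hypothetical sketch of a shared "OpenAI-compatible" completions helper.
// Each backend would supply its own base URL and headers; the request body
// stays in the common OpenAI chat-completions shape.
use reqwest::header::{HeaderMap, HeaderValue, AUTHORIZATION};
use serde_json::{json, Value};

async fn openai_compatible_completions(
    base_url: &str,
    headers: HeaderMap,
    body: Value,
) -> Result<Value, reqwest::Error> {
    let client = reqwest::Client::new();
    let res = client
        .post(format!("{base_url}/chat/completions"))
        .headers(headers)
        .json(&body)
        .send()
        .await?;
    res.json::<Value>().await
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Backend-specific part: Perplexity-style bearer auth header.
    let mut headers = HeaderMap::new();
    headers.insert(
        AUTHORIZATION,
        HeaderValue::from_str(&format!("Bearer {}", std::env::var("PERPLEXITY_TOKEN")?))?,
    );

    // Shared part: OpenAI-style chat-completions body.
    let body = json!({
        "model": "some-model",
        "messages": [{ "role": "user", "content": "Hello!" }],
        "stream": false,
    });

    let reply = openai_compatible_completions("https://api.perplexity.ai", headers, body).await?;
    println!("{reply}");
    Ok(())
}
```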

dustinblackman commented 5 months ago

It could be appropriate to refactor this code into a single openai_compatible_completions method of sorts and have each backend call it with its respective URL, headers, and data.

This has been my problem as I've looked at more OpenAI proxies and compatible APIs. They claim to be compatible, but there are little nuances that are enough of a pain that attempting to create a helper function will eventually blow up into a set of complex conditions just to set up the request. If the APIs are similar, I'm more open to copying the OpenAI backend infrastructure file, renaming functions appropriately, and making the changes needed to work with the new API.

If you know of anyone at Perplexity who'd be open to providing a free account for a short period of time, I'd be happy to implement this. Otherwise I love PRs! :D

aemonge commented 5 months ago

Ohh, I see.

Claiming compatibility and actually delivering it are two different things; that's common in the industry, hehe.

I'll keep my ears open in case I can find anyone from Perplexity, but I'll focus on a PR when I get some extra time. Thanks!