FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[FEATURE] Perplexity and Groq LLM support #1860

Open tiangao88 opened 7 months ago

tiangao88 commented 7 months ago

Please add support for two new LLM providers: Perplexity and Groq.

HenryHengZJ commented 7 months ago

Groq PR

HenryHengZJ commented 7 months ago

Any contribution to add Perplexity support is welcome!

njfio commented 7 months ago

You can try Perplexity now with the ChatOpenAI Custom module:

  1. Add a ChatOpenAI Custom module.
  2. Set up the credential with your Perplexity API key.
  3. Enter a model name, e.g. 'sonar-medium-online' (see https://docs.perplexity.ai/docs/rate-limits for available models).
  4. Under Additional Parameters, set the base path to https://api.perplexity.ai.
  5. A frequency penalty is also required, e.g. 0.9.

Not all agents/chains that support OpenAI appear to work fully, but a simple LLM Chain is functional (screenshot example).
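As a quick sanity check outside Flowise, the same settings can be exercised directly against Perplexity's OpenAI-compatible endpoint. This is only a minimal sketch using the official `openai` Node client; the model name and frequency penalty are the example values from the steps above, and the environment variable name is illustrative rather than anything Flowise defines.

```ts
// Sketch: call Perplexity through its OpenAI-compatible API, mirroring the
// ChatOpenAI Custom settings described above (base path + frequency penalty).
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.PERPLEXITY_API_KEY, // Perplexity API key (same value used for the Flowise credential)
  baseURL: "https://api.perplexity.ai",   // base path from Additional Parameters
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "sonar-medium-online",   // example model from the comment above
    frequency_penalty: 0.9,         // required per the comment above
    messages: [{ role: "user", content: "What is Flowise?" }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

If this request succeeds, the same credential, base path, model name, and frequency penalty should work in the ChatOpenAI Custom module.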

mikehudson2 commented 5 months ago

The setup above also works (somewhat) with the models released on 1 May 2024: llama-3-sonar-large-32k-online and llama-3-sonar-small-32k-online.

cooldude6000 commented 3 months ago

Is this issue still open? Can I take it?

raffareis commented 1 month ago

> Is this issue still open? Can I take it?

Did you take it? Can I take it? @cooldude6000 @HenryHengZJ