ChatGPT, Claude, Perplexity, and Gemini integrations for chat, information retrieval, and text processing tasks, such as paraphrasing, simplifying, or summarizing, with support for third-party proxies and local LLMs.
> [!NOTE]
> This is an alpha preview version of the workflow. You can download it here: Ayai · GPT Nexus
- `↩` to ask a question.
- `⌘↩` to start a new conversation.
- `⌥↩` to copy the last answer.
- `⌃↩` to copy the full conversation.
- `⇧↩` to stop generating an answer.
### Hidden Options
A prompt is the text that you give the model to elicit, or "prompt," a relevant output. A prompt is usually in the form of a question or instructions.
The primary configuration setting determines which service is used for conversations.
If you want to use a third-party proxy[^1], define the corresponding `host`, `path`, `API key`, `model`, and, if required, the `url scheme` or `port` in the environment variables.
The variables are prefixed as alternatives to OpenAI, because Ayai expects the returned stream events and errors to mirror the shape of those returned by the OpenAI API.
If you want to use a local language model[^2], define the corresponding `url scheme`, `host`, `port`, `path`, and, if required, the `model` in the environment variables to establish a connection to the local HTTP server initiated and maintained by the method of your choice.
The variables are prefixed as alternatives to OpenAI, because Ayai expects the returned stream events and errors to mirror the shape of those returned by the OpenAI API.
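Because the stream must mirror OpenAI's shape, each event arrives as a `data: {...}` line carrying a `choices[0].delta` fragment, and the stream ends with `data: [DONE]`. The sketch below shows a minimal parser for that shape; the Ollama defaults in the comment (`http://localhost:11434`) are defaults of that tool, not values taken from this workflow:

```python
import json

# Minimal sketch of consuming an OpenAI-style SSE stream, the shape a
# local server (e.g. Ollama at http://localhost:11434, or LM Studio)
# is expected to emit. Each event is a "data: {...}" line; the literal
# "data: [DONE]" terminates the stream.

def collect_stream_text(lines):
    """Concatenate the content deltas from OpenAI-style stream events."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip keep-alives and blank lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        event = json.loads(payload)
        delta = event["choices"][0]["delta"]
        parts.append(delta.get("content", ""))  # first delta may carry only a role
    return "".join(parts)

# Example events in the OpenAI streaming format:
events = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": ", world."}}]}',
    "data: [DONE]",
]
print(collect_stream_text(events))  # Hello, world.
```

A server whose events deviate from this structure (different field names, no `[DONE]` sentinel) would break such a consumer, which is why the variables are treated as OpenAI alternatives.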
[^1]: Third-party proxies such as OpenRouter, Groq, Fireworks, or Together.ai.
[^2]: Local HTTP servers can be set up with interfaces such as LM Studio or Ollama.