
Omniplex

Open-Source Perplexity

Website · Discord · Reddit

:construction: Under Active Development

Our focus is on establishing core functionality and essential features. As we continue to develop Omniplex, we are committed to implementing best practices, refining the codebase, and introducing new features to enhance the user experience.

Get started

To run the project locally, update the Chat component to use the code marked // Development Code.

  1. Fork & clone the repository
git clone git@github.com:[YOUR_GITHUB_ACCOUNT]/omniplex.git
  2. Install the dependencies
yarn
  3. Fill out the secrets in .env.local (see the note after these steps)
BING_API_KEY=
OPENAI_API_KEY=

OPENWEATHERMAP_API_KEY=
ALPHA_VANTAGE_API_KEY=
FINNHUB_API_KEY=
  4. Run the development server
yarn dev
  5. Open http://localhost:3000 in your browser to see the app.
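
The secrets in .env.local are read on the server only, via process.env. As a quick sanity check (a minimal, hypothetical route that is not part of the repo), you can confirm a key is picked up before wiring it into a feature:

// app/api/health/route.ts — hypothetical file, for illustration only.
// Secrets from .env.local are exposed to server code through process.env.
export async function GET() {
  const openaiKeyConfigured = Boolean(process.env.OPENAI_API_KEY);
  return Response.json({ openaiKeyConfigured });
}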

Plugins Development

This is a hacky approach for now, but it is easy to implement; we will add a more robust way to register plugins in the future. Feel free to study the sample plugin we have added, and see the sketch after this list for a rough idea of what steps 1 and 4 involve.

  1. Update the types in types.ts to include the new plugin data types.
  2. Update the tools API in api to include the new plugin function call.
  3. Update api.ts in utils to include the new plugin data.
  4. Update chatSlice.ts in store to include the new plugin reducer.
  5. Create a new folder in the components directory for the plugin's UI.
  6. Update chat.tsx to handle the new plugin in useEffect.
  7. Call the plugin function and return the data as props to source.
  8. Update source.ts to render the plugin UI.
  9. Lastly, update data.ts in utils so the plugin shows up in the plugin tab.
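
To make steps 1 and 4 more concrete, here is a rough sketch of what a new plugin's data type and reducer might look like. It assumes the store is built with Redux Toolkit (which is what chatSlice.ts suggests); the plugin name, fields, and action are hypothetical and only meant to show the shape of the change:

// Hypothetical "weather" plugin, for illustration only.
import { createSlice, PayloadAction } from "@reduxjs/toolkit";

// Step 1: the plugin's data type (would live in types.ts).
export type WeatherData = {
  location: string;
  temperature: number;
  description: string;
};

// Step 4: a reducer that stores the plugin result (would be added to chatSlice.ts).
type ChatState = { weather?: WeatherData };

const chatSlice = createSlice({
  name: "chat",
  initialState: {} as ChatState,
  reducers: {
    setWeather(state, action: PayloadAction<WeatherData>) {
      state.weather = action.payload;
    },
  },
});

export const { setWeather } = chatSlice.actions;
export default chatSlice.reducer;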

Multi-LLM Support: Example

  1. Add the new LLM's API key to .env.local and install the related npm package (for Anthropic, that is @anthropic-ai/sdk).
ANTHROPIC_API_KEY=******
  2. Update the chat route in api
import Anthropic from "@anthropic-ai/sdk";
import { AnthropicStream, StreamingTextResponse } from "ai";

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

export const runtime = "edge";

export async function POST(req: Request) {
  // Anthropic does not accept frequency_penalty or presence_penalty,
  // so only the supported parameters are forwarded.
  const { messages, model, temperature, max_tokens, top_p } = await req.json();

  const response = await anthropic.messages.create({
    stream: true,
    model,
    temperature,
    max_tokens,
    top_p,
    messages,
  });

  // AnthropicStream (not OpenAIStream) parses the Anthropic event stream.
  const stream = AnthropicStream(response);
  return new StreamingTextResponse(stream);
}
  3. Update the data in utils
export const MODELS = [
  { label: "Claude 3 Haiku", value: "claude-3-haiku-20240307" },
  { label: "Claude 3 Sonnet", value: "claude-3-sonnet-20240229" },
  { label: "Claude 3 Opus", value: "claude-3-opus-20240229" },
];
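
Once MODELS includes the new entries, the selected value is what the client sends with each chat request. A minimal sketch of that call follows; the /api/chat path and the exact request body are assumptions here, so match them to the chat route in the repo:

// Hypothetical client-side call; adjust the endpoint and body to the actual chat API.
async function streamClaude(messages: { role: string; content: string }[]) {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages,
      model: "claude-3-haiku-20240307", // one of the values from MODELS
      temperature: 1,
      max_tokens: 1024,
      top_p: 1,
    }),
  });
  return res.body; // ReadableStream produced by StreamingTextResponse
}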

Disclaimer

We recently transitioned from the pages directory to the app directory, which involved significant changes to the project structure and architecture. As a result, you may encounter some inconsistencies or rough edges in the codebase.

Roadmap

App Architecture

Services

Contributing

We welcome contributions from the community! If you'd like to contribute to Omniplex, please follow these steps:

  1. Fork the repository
  2. Create a new branch for your feature or bug fix
  3. Make your changes and commit them with descriptive messages
  4. Push your changes to your forked repository
  5. Submit a pull request to the main repository

Please ensure that your code follows our coding conventions and passes all tests before submitting a pull request.

License

This project is licensed under the AGPL-3.0 license.

Contact

If you have any questions or suggestions, feel free to reach out to us via the Contact link.

Happy coding! 🚀