miroslavpejic85 / mirotalk

🚀 WebRTC - P2P - Simple, Secure, Fast Real-Time Video Conferences Up to 8k and 60fps, compatible with all browsers and platforms.
https://p2p.mirotalk.com
GNU Affero General Public License v3.0

Ollama AI Integration #259

Open trymeouteh opened 2 weeks ago

trymeouteh commented 2 weeks ago

Feature request

The option to use Ollama instead of proprietary AI services like OpenAI.

The hosting provider could provide an Ollama API for users, or the host of a room could enable Ollama and point it at the URL of their own Ollama server, making it available to the host and to every other user in the room.
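For illustration, a minimal sketch of what such a call against Ollama's native chat endpoint (`POST /api/chat`, default port `11434`) could look like; the `askOllama` helper and the `llama3` model name are only examples, and the base URL would come from whatever setting the host configures:

```typescript
// Minimal sketch: send a chat prompt to a host-configured Ollama server.
// The base URL is whatever the room host (or hosting provider) sets;
// 'llama3' is just an example of a model already pulled on that server.
async function askOllama(baseUrl: string, prompt: string): Promise<string> {
    const res = await fetch(`${baseUrl}/api/chat`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            model: 'llama3',
            messages: [{ role: 'user', content: prompt }],
            stream: false, // single JSON response instead of a token stream
        }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    return data.message.content; // assistant reply text
}

// e.g. askOllama('http://localhost:11434', 'Summarize the key points of this meeting');
```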

Pros

- AI privacy
- Many LLMs to choose from
- Free

Additional context

https://ollama.com/

miroslavpejic85 commented 19 hours ago

Thank you for the feature suggestion to replace ChatGPT with Ollama.

Pros of Implementing Ollama:

  1. Privacy: Ollama can be self-hosted, giving more control over user data and avoiding third-party data collection, which enhances privacy.
  2. Flexibility: Ollama supports multiple LLMs, allowing customization for specific use cases, and offers more options for AI model selection.
  3. Cost-Effective: Ollama’s free API provides a budget-friendly alternative to proprietary services like OpenAI, which can reduce operational costs.
  4. Control: The host or room administrator can enable Ollama and set the API URL to their own server, providing full control over the AI environment.
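As a rough illustration of point 4, enabling this could come down to a few server-side settings; the variable names below (`ENABLE_OLLAMA`, `OLLAMA_BASE_URL`, `OLLAMA_MODEL`) are hypothetical and do not exist in MiroTalk today:

```typescript
// Hypothetical server-side toggle: none of these env vars exist in MiroTalk today.
// The host enables Ollama and points it at their own server; everyone in the room
// then shares that AI endpoint instead of a proprietary one.
const aiConfig = {
    ollamaEnabled: process.env.ENABLE_OLLAMA === 'true',
    ollamaBaseUrl: process.env.OLLAMA_BASE_URL || 'http://localhost:11434',
    ollamaModel: process.env.OLLAMA_MODEL || 'llama3',
};

export { aiConfig };
```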

Cons of Implementing Ollama:

  1. Integration Complexity: Setting up and maintaining Ollama, particularly if self-hosted, may increase system complexity and require ongoing management.
  2. Resource Management: AI processing could put additional load on the server, and latency may increase, especially if the AI service is resource-intensive.
  3. Limited Commercial Support: Unlike ChatGPT, Ollama may not offer the same level of enterprise support, which could be an issue for mission-critical applications.
  4. Scalability: Self-hosting Ollama might create challenges around scalability, especially with a high number of users in rooms or large-scale AI usage.

Why We Want to Keep ChatGPT for Now:

We plan to continue using ChatGPT for the time being for several key reasons:

  1. Advanced Capabilities: ChatGPT, especially with GPT-4, provides state-of-the-art conversational abilities and robust performance across a wide range of tasks.
  2. Reliability: OpenAI’s ChatGPT has been extensively tested and is known for its accuracy and ability to handle complex interactions, ensuring a high-quality user experience.
  3. Commercial Support: OpenAI offers reliable support, documentation, and continuous updates to improve performance and security, which is crucial for our platform’s stability and growth.
  4. Familiarity and Trust: Many users are already familiar with ChatGPT’s capabilities, and keeping it ensures that they have access to a trusted, cutting-edge AI solution.

Conclusion:

While we recognize the benefits of Ollama, especially for privacy and cost-efficiency, we believe keeping ChatGPT for now allows us to maintain a high-quality AI experience backed by reliable commercial support and advanced capabilities. However, we are open to exploring Ollama further in the future, especially as privacy and cost considerations continue to evolve.
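One note for that future exploration: Ollama also exposes an OpenAI-compatible endpoint (`/v1/chat/completions`), so an experimental opt-in could, in principle, reuse the existing OpenAI client with only a different base URL. A minimal sketch assuming the official `openai` npm package; the configuration values and model names are illustrative:

```typescript
import OpenAI from 'openai';

// Sketch of one client for both providers: when Ollama is enabled, point the
// OpenAI SDK at Ollama's OpenAI-compatible endpoint instead of api.openai.com.
// Ollama ignores the API key, but the SDK requires a non-empty value.
function createChatClient(useOllama: boolean): OpenAI {
    return useOllama
        ? new OpenAI({ baseURL: 'http://localhost:11434/v1', apiKey: 'ollama' })
        : new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
}

async function chat(client: OpenAI, model: string, prompt: string): Promise<string> {
    const completion = await client.chat.completions.create({
        model, // e.g. 'gpt-4o' for OpenAI or 'llama3' for Ollama
        messages: [{ role: 'user', content: prompt }],
    });
    return completion.choices[0].message.content ?? '';
}
```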