Chat with Ollama

A PHP-based private Ollama chatbot for personal use.

Chat with Ollama leverages the Ollama API to provide an interactive chatbot experience. The project is built with PHP and integrates seamlessly with the Ollama API to deliver a robust and flexible chatbot solution.


Prerequisites

  - PHP and Composer (for the PHP dependencies)
  - Node.js and npm (for the JavaScript dependencies)
  - A local Ollama installation with its API running (by default at http://localhost:11434)

Installation

  1. Clone the repository:

    git clone https://github.com/LebToki/chat-with-ollama.git
    cd chat-with-ollama
  2. Install PHP dependencies:

    composer install
  3. Install JavaScript dependencies:

    npm install
  4. Configure the connection settings in config.php:

    'ollamaApiUrl' => 'http://localhost:11434/api/',
    'jwtToken' => 'YOUR_JWT_TOKEN_HERE',
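
Before going further, it can help to confirm that the Ollama API is actually reachable at the URL you configured. The short check below is not part of the repository; it assumes the config.php shown above and uses Ollama's standard GET /api/tags endpoint, which lists the locally installed models (the file name verify_ollama.php is just an example).

    <?php
    // verify_ollama.php — illustrative helper, not shipped with the project.
    // Loads the configured API URL and asks Ollama which models are installed.
    $config = require __DIR__ . '/config.php';

    // GET /api/tags returns a JSON object with a "models" array.
    // Requires allow_url_fopen to be enabled; otherwise use cURL instead.
    $response = @file_get_contents(rtrim($config['ollamaApiUrl'], '/') . '/tags');

    if ($response === false) {
        echo "Could not reach the Ollama API. Is `ollama serve` running?\n";
        exit(1);
    }

    $data   = json_decode($response, true);
    $models = $data['models'] ?? [];

    echo "Ollama is up. Installed models:\n";
    foreach ($models as $model) {
        echo '  - ' . $model['name'] . "\n";
    }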

Usage

  1. Start the PHP built-in server:

    php -S localhost:8000
  2. Open your browser and navigate to the address below (if you already run a local web stack, serve the project on your own port instead):

    http://localhost:8000
  3. Interact with the chatbot by typing a message in the input box and clicking the send button.
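
Under the hood, each message you type is sent to the Ollama API. The sketch below shows one way a prompt could be forwarded to Ollama's /api/generate endpoint with PHP's cURL extension; it is a simplified illustration, not the project's actual controller code, and the askOllama function name is made up for this example.

    <?php
    // Illustrative only — the repository's own request handling may differ.
    $config = require __DIR__ . '/config.php';

    function askOllama(array $config, string $model, string $prompt): string
    {
        $ch = curl_init(rtrim($config['ollamaApiUrl'], '/') . '/generate');
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
            // "stream" => false makes Ollama return a single JSON object
            // instead of a stream of partial chunks.
            CURLOPT_POSTFIELDS     => json_encode([
                'model'  => $model,
                'prompt' => $prompt,
                'stream' => false,
            ]),
        ]);

        $raw = curl_exec($ch);
        curl_close($ch);

        return $raw === false ? '' : (json_decode($raw, true)['response'] ?? '');
    }

    echo askOllama($config, 'llama3', 'Why is the sky blue?') . "\n";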

Configuration

The config.php file contains the settings required to connect to the Ollama API. Ensure you have the correct API URL and JWT token set up:

return [
    'ollamaApiUrl' => 'http://localhost:11434/api/',
    'jwtToken' => 'YOUR_JWT_TOKEN_HERE',
];
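
If Ollama is not running on the same machine as the chatbot, point ollamaApiUrl at the host that does run it; Ollama listens on port 11434 by default. The host address below is only an example, not a value from the project:

    return [
        // Example only: Ollama running on another machine on the local network.
        'ollamaApiUrl' => 'http://192.168.1.50:11434/api/',
        'jwtToken'     => 'YOUR_JWT_TOKEN_HERE',
    ];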

Choose Your Default Model

Select your default model for the session from the header, or set a default one on the Settings page.

This lets you customize your experience, since different models can respond quite differently.

A bit of styling and we have a winner!

Once I get file uploads working, you'll be able to use it to chat with your files.

Attention developers and chatbot enthusiasts! Are you ready to enhance your development experience with an intuitive chatbot interface? Look no further than our customized user interface designed specifically for Chat with Ollama.

🚀 Pros and devs love Ollama, and we're sure they'll love Chat with Ollama too; the combination of the two is hard to beat!


Our UI automatically connects to the Ollama API, making it easy to manage your chat interactions. Plus, we've included an automated model selection feature for popular models like llama2 and llama3. We've gone the extra mile to provide a visually appealing and intuitive interface that's easy to navigate, so you can spend more time coding and less time configuring. And with a responsive design, you can access our UI on any device.

Key Features

  - Automatic connection to the Ollama API
  - Automated model selection for popular models such as llama2 and llama3
  - A clean, intuitive interface so you spend more time coding and less time configuring
  - Responsive design that works on any device


How to use

Clone the repository and set up your project by following the instructions in the setup guide. Ensure your Ollama API URL and JWT token are configured correctly in the config.php file. Use the fetch_models.php script to fetch the available models from the Ollama API and update the model list.

// config.php

return [
    'ollamaApiUrl' => 'http://localhost:11434/api/',
    'jwtToken' => 'YOUR_JWT_TOKEN'
];

Run the fetch_models.php script to update the models list.

php fetch_models.php
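
For reference, the idea behind such a script can be sketched as follows: Ollama's GET /api/tags endpoint returns the locally installed models, and their names can be collected for the model dropdown. This is a simplified illustration under that assumption, not the shipped fetch_models.php, and the models.json output file is only an example name.

    <?php
    // Simplified illustration of fetching the installed models from Ollama.
    $config = require __DIR__ . '/config.php';

    $raw    = file_get_contents(rtrim($config['ollamaApiUrl'], '/') . '/tags');
    $models = json_decode($raw, true)['models'] ?? [];

    // Keep just the model names (e.g. "llama3:latest") for the UI dropdown.
    $names = array_map(fn ($m) => $m['name'], $models);

    // Example output target; the real script may update the list differently.
    file_put_contents('models.json', json_encode($names, JSON_PRETTY_PRINT));
    echo 'Saved ' . count($names) . " models to models.json\n";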

Start interacting with the chatbot through the UI.

Feedback

Found a bug or have an idea? Open an issue on GitHub or join the Discussions.

Changelog

What's New in 1.0.0 · June 2, 2024 (major release with foundational features)

Other Updates and Improvements:

📢️ Thanks everyone for your support and words of love for Chat with Ollama, I am committed to creating the best Chatbot Interface to support the ever-growing community.

Initial Release

Code Organization
  - Initial setup of the project with an organized structure for controllers, models, and views.

Error Handling
  - Basic error handling for API requests and user inputs.

Front-end Enhancements
  - Initial design of the UI with Bootstrap and FontAwesome integration.
  - Responsive design for better accessibility on all devices.

Performance Considerations
  - Basic optimizations for faster loading times.

Accessibility and Usability
  - Added alt attributes to all images for better accessibility.

Modern PHP Features
  - Utilized modern PHP features for better performance and readability.

For full details and previous releases, check out the changelog.


Get Involved

Whether you're a developer, system integrator, or enterprise user, you can trust that we've done everything possible to make setting up Chat with Ollama as easy as 1-2-3.

⭐ Give us a star on GitHub 👆

⭐ Fork the project on GitHub and contribute 👆

🚀 Do you like to code? You're more than welcome to contribute. Join the Discussions!

💡 Got a feature suggestion? Add your roadmap ideas


This project is licensed under the Attribution License.

This work by Tarek Tarabichi is licensed under CC BY 4.0

2023-2024 · Tarek Tarabichi from 2TInteractive.com · Made with 💙