Chat with Ollama leverages the Ollama API to provide an interactive chatbot experience. The project is built with PHP and integrates seamlessly with the Ollama API to deliver a robust and flexible chatbot solution.
Clone the repository:
git clone https://github.com/LebToki/chat-with-ollama.git
cd chat-with-ollama
Install PHP dependencies:
composer install
Install JavaScript dependencies:
npm install
Configure the settings in config.php:
return [
    'ollamaApiUrl' => 'http://localhost:11434/api/',
    'jwtToken' => 'YOUR_JWT_TOKEN_HERE',
];
Start the PHP built-in server:
php -S localhost:8000
Open your browser and navigate to http://localhost:8000. If you already run a local web stack, serve the project on a port of your choice instead.
Interact with the chatbot by typing a message in the input box and clicking the send button.
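Under the hood, sending a message boils down to a POST request to the Ollama API's /api/chat endpoint. The sketch below is illustrative, not the project's actual code: the helper names are hypothetical, and the Authorization header mirrors the jwtToken setting from config.php (whether your setup actually checks it depends on how you deploy).

```php
<?php
// Hypothetical sketch of sending one chat message to Ollama's /api/chat
// endpoint. Function names are illustrative, not the project's own.

function buildChatPayload(string $model, string $message): string {
    // Ollama's chat endpoint expects a model name and a messages array;
    // stream=false requests a single JSON response instead of a stream.
    return json_encode([
        'model'    => $model,
        'messages' => [['role' => 'user', 'content' => $message]],
        'stream'   => false,
    ]);
}

function sendChat(string $apiUrl, string $jwtToken, string $payload): string {
    // $apiUrl is the ollamaApiUrl setting, e.g. 'http://localhost:11434/api/'.
    $ch = curl_init($apiUrl . 'chat');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $payload,
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $jwtToken,
        ],
    ]);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response === false ? '' : $response;
}
```

Against a running Ollama instance, `sendChat($config['ollamaApiUrl'], $config['jwtToken'], buildChatPayload('llama3', 'Hello'))` returns the model's JSON reply.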
The config.php file contains the settings required to connect to the Ollama API. Ensure you have the correct API URL and JWT token set up:
return [
'ollamaApiUrl' => 'http://localhost:11434/api/',
'jwtToken' => 'YOUR_JWT_TOKEN_HERE',
];
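Since config.php returns a plain PHP array, it can be loaded with a single require. A minimal sketch, assuming only the two keys shown above (the loadConfig helper is illustrative, not part of the project):

```php
<?php
// Minimal sketch: load and validate config.php. The keys match the config
// array shown above; the helper name is hypothetical.

function loadConfig(string $path): array {
    $config = require $path;
    foreach (['ollamaApiUrl', 'jwtToken'] as $key) {
        if (empty($config[$key])) {
            throw new RuntimeException("config.php is missing '$key'");
        }
    }
    // Normalize the URL so endpoint paths like 'chat' can be appended safely.
    $config['ollamaApiUrl'] = rtrim($config['ollamaApiUrl'], '/') . '/';
    return $config;
}

// Usage:
// $config = loadConfig(__DIR__ . '/config.php');
```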
Select the model for the session from the header, or set a default on the settings page. Different models respond differently, so this lets you pick the one that best suits your task.
Once file uploads are working, you'll be able to chat with your files.
Attention developers and chatbot enthusiasts! Are you ready to enhance your development experience with an intuitive chatbot interface? Look no further than our customized user interface designed specifically for Chat with Ollama.
🚀 Pros and devs love Ollama, and pairing it with Chat with Ollama makes the combination unbeatable!
Our UI automatically connects to the Ollama API, making it easy to manage your chat interactions. Plus, we've included an automated model selection feature for popular models like llama2 and llama3. We've gone the extra mile to provide a visually appealing and intuitive interface that's easy to navigate, so you can spend more time coding and less time configuring. And with a responsive design, you can access our UI on any device.
Clone the repository and set up your project by following the instructions in the setup guide. Ensure your Ollama API URL and JWT token are configured correctly in the config.php file. Use the fetch_models.php script to fetch the available models from the Ollama API and update the model list.
// config.php
return [
'ollamaApiUrl' => 'http://localhost:11434/api/',
'jwtToken' => 'YOUR_JWT_TOKEN_HERE'
];
Run the fetch_models.php script to update the models list.
php fetch_models.php
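Fetching the available models relies on Ollama's /api/tags endpoint, which returns the locally installed models as a JSON "models" array. A hedged sketch of the extraction step (the helper name is illustrative, not the actual contents of fetch_models.php):

```php
<?php
// Hypothetical sketch: pull model names out of Ollama's /api/tags response.
// /api/tags returns JSON like {"models":[{"name":"llama3:latest", ...}, ...]}.

function extractModelNames(string $tagsJson): array {
    $data = json_decode($tagsJson, true);
    return array_map(fn($m) => $m['name'], $data['models'] ?? []);
}

// Usage against a live Ollama instance:
// $json = file_get_contents('http://localhost:11434/api/tags');
// print_r(extractModelNames($json));
```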
Start interacting with the chatbot through the UI.
What's New in 1.0.0 · June 2, 2024 (major release with foundational features)
Introduced Dark Mode Support:
Implemented dark mode for a better user experience during night-time usage.
Enhanced Model Selection:
Automated fetching and selection of models from the Ollama API.
Improved UI for model selection and configuration.
📢️ Thanks everyone for your support and words of love for Chat with Ollama, I am committed to creating the best Chatbot Interface to support the ever-growing community.
For full details and former releases, check out the changelog.
Whether you're a developer, system integrator, or enterprise user, you can trust that we did everything possible to make setting up Chat with Ollama as easy as 1, 2, 3.
⭐ Give us a star on GitHub 👆
⭐ Fork the project on GitHub and contribute👆
🚀 Do you like to code? You're more than welcome to contribute. Join the Discussions!
💡 Got a feature suggestion? Add your roadmap ideas
This project is licensed under the Attribution License.
This work by Tarek Tarabichi is licensed under CC BY 4.0
2023-2024 · Tarek Tarabichi from 2TInteractive.com · Made with 💙