PowerShell / AIShell

An interactive shell to work with AI-powered assistance providers
MIT License

Implement settings file for Ollama agent #307

Open kborowinski opened 9 hours ago

kborowinski commented 9 hours ago

PR Summary

This PR implements a basic settings file for the Ollama agent:

  1. The settings file is stored at $HOME\.aish\agent-config\ollama\ollama.config.json
  2. Currently, only the model and endpoint parameters are stored in the config file.
  3. A default config file is created when none exists:

    {
      // To use the Ollama API service:
      // 1. Install Ollama:
      //      winget install Ollama.Ollama
      // 2. Start the Ollama API server:
      //      ollama serve
      // 3. Install an Ollama model:
      //      ollama pull phi3

      // Declare the Ollama model
      "Model": "phi3",
      // Declare the Ollama endpoint
      "Endpoint": "http://localhost:11434"
    }
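As a rough illustration of how a commented JSON file like the one above could be read back with fallback defaults, here is a minimal Python sketch (the actual agent is written in C#; the `load_settings` helper and its comment-stripping regex are assumptions for illustration, not code from this PR):

```python
import json
import re

# Defaults mirroring the generated config shown above.
DEFAULTS = {"Model": "phi3", "Endpoint": "http://localhost:11434"}

def load_settings(text: str) -> dict:
    """Strip whole-line // comments, then merge the parsed JSON over defaults."""
    # Only lines that start with // are comments, so values such as
    # "http://localhost:11434" are left untouched.
    stripped = re.sub(r"^\s*//.*$", "", text, flags=re.MULTILINE)
    settings = dict(DEFAULTS)
    settings.update(json.loads(stripped))
    return settings
```

With this approach, a user-edited file only needs to override the keys they care about; anything missing falls back to the defaults.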

PR Context

This PR allows the user to specify a custom Ollama model and endpoint location instead of the previously hardcoded values. It partially implements #155 (history and streaming are not yet supported).
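For context, once the model and endpoint are configurable, a request to Ollama's `/api/generate` route can be assembled from just those two settings. The helper below is a hypothetical Python sketch of that wiring, not the agent's C# code (`build_generate_request` is an invented name; `stream: false` reflects the note above that streaming is not yet implemented):

```python
import json
from urllib import request

def build_generate_request(settings: dict, prompt: str) -> request.Request:
    """Build a POST to Ollama's /api/generate from the configured values."""
    # Tolerate a trailing slash in the configured endpoint.
    url = settings["Endpoint"].rstrip("/") + "/api/generate"
    body = json.dumps({
        "model": settings["Model"],
        "prompt": prompt,
        "stream": False,  # streaming is not part of this PR
    }).encode()
    return request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
```

Swapping models then becomes a one-line edit to `ollama.config.json` rather than a recompile.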

@StevenBucher98 : This is quick and dirty, but I needed the settings file ASAP to test different models without constantly recompiling the agent. I'm open to suggestions.