sigoden / aichat

All-in-one AI CLI tool featuring Chat-REPL, Shell Assistant, RAG, AI tools & agents, with access to OpenAI, Claude, Gemini, Ollama, Groq, and more.
Apache License 2.0

What is the default model? Set up a default model #361

Closed · engie-b2c-perf closed 6 months ago

engie-b2c-perf commented 6 months ago

Is your feature request related to a problem? Please describe.
It is not really a problem, but I am confused about which model will be used by default when I have several models set up.

Here is my config.yaml:

clients:
  - type: vertexai
    api_base: https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}/locations/{REGION}/publishers/google/models
  - type: ollama
    api_base: http://localhost:11434
    models:
    - name: llama2
      max_input_tokens: null
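
If I read the --help output correctly, I can always pick a model explicitly per invocation with the -m/--model flag, e.g.:

$ aichat -m ollama:llama2 "hello"

but I want to understand which model is used when none is given.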

The models are recognized:

$ aichat --list-models
vertexai:gemini-1.0-pro
vertexai:gemini-1.0-pro-vision
vertexai:gemini-1.0-ultra
vertexai:gemini-1.0-ultra-vision
vertexai:gemini-1.5-pro
ollama:llama2
$ aichat
>
# what is the active default model here? 

Describe the solution you'd like
I would like:

  • to set up a default model
  • to know what the current model is in chat mode
  • bonus: to restrict the models that can be used for one client

Describe alternatives you've considered
I tried playing with config.yaml to achieve this, but it didn't work.

I tried to force one model like this:

clients:
  - type: vertexai
    api_base: https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}/locations/{REGION}/publishers/google/models
    models:
    - name: vertexai:gemini-1.0-pro
  - type: ollama
    api_base: http://localhost:11434
    models:
    - name: llama2
      max_input_tokens: null

Thanks for the help! Your tool is great.

chiefMarlin commented 6 months ago

If I am not mistaken, the default model is configured at the top of config.yaml with the model directive, e.g. model: claude:claude-3-opus-20240229
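
Something like this at the top of config.yaml should do it (a sketch based on your config above; substitute any model id from your --list-models output):

model: vertexai:gemini-1.0-pro    # default model, in <client>:<model-name> form
clients:
  - type: vertexai
    api_base: https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}/locations/{REGION}/publishers/google/models
  - type: ollama
    api_base: http://localhost:11434
    models:
    - name: llama2

Also, if I am reading your second attempt right, the name under a client's models list should be the bare model name (like your llama2 entry), without the vertexai: prefix; the prefixed form belongs only in the top-level model key.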

engie-b2c-perf commented 6 months ago

> If I am not mistaken, the default model is configured at the top of config.yaml with the model directive, e.g. model: claude:claude-3-opus-20240229

You are totally right @chiefMarlin. It works! Thanks 👍

If I may, can we leave this issue open for the two other questions:

I would like:

  • to know what the current model is in chat mode
  • bonus: to restrict models that can be used for one client. For example, here I could restrict vertexai to only one model:
    $ aichat --list-models
    vertexai:gemini-1.0-pro
    ollama:llama2

sigoden commented 6 months ago
  1. To know what the current model is in chat mode, use aichat --info (command mode) or .info (REPL mode).
  2. aichat does not support restricting the client's available models. This doesn't make much sense.
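
For example, both of these show the active model:

$ aichat --info    # command mode: prints configuration details, including the current model
> .info            # REPL mode: prints the same information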