marcolardera / chatgpt-cli

Simple yet effective command line client for chatting with ChatGPT using the official API
MIT License

Consider adding more parameters to the config file #16

Open AndreaPi opened 1 year ago

AndreaPi commented 1 year ago

Hi,

I like your minimalistic approach a lot! But the lack of a few configurable parameters made me switch to https://github.com/j178/chatgpt. If you could add the following parameters to the yaml file:

  "prompts": {
    "default": "You are a helpful assistant"
     "pirate": "You are pirate Blackbeard. Arr matey!"},
  "conversation": {
    "prompt": "default",
    "stream": true,
    "max_tokens": 1024,
    "temperature": 0
  }

I would be happy to switch back! Basically, this is adding the following functionalities:

  1. the ability to define one or more contexts directly in the yaml file. This is more convenient than having to carry around a separate file for each context and pass it via --context <FILE PATH>
  2. stream allows the tokens to be sent as they become available, rather than all at once at the end of the reply. This makes quite a difference with long responses and slower models such as GPT-4
  3. max_tokens is self-explanatory 🙂 and it also makes quite a difference when using GPT-4.
  4. temperature set to 0 yields deterministic responses (fundamental for reproducibility). Values from 0 to 2 allow increasingly creative but less focused responses.

These are very simple modifications: you just need to read them from the yaml file and add them as extra parameters when posting the request. Thanks!
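A minimal sketch of what that could look like. The config structure mirrors the snippet proposed above, but the helper name `build_payload` is hypothetical, and JSON is parsed in place of YAML only to keep the example standard-library-only (JSON is valid YAML):

```python
import json

# Hypothetical config mirroring the structure proposed in this issue.
# The real chatgpt-cli config is a YAML file; JSON is used here only so
# this sketch runs without third-party dependencies.
CONFIG = json.loads("""
{
  "prompts": {
    "default": "You are a helpful assistant",
    "pirate": "You are pirate Blackbeard. Arr matey!"
  },
  "conversation": {
    "prompt": "default",
    "stream": true,
    "max_tokens": 1024,
    "temperature": 0
  }
}
""")

def build_payload(config, user_message, model="gpt-4"):
    """Merge the configured parameters into a chat completion request body."""
    conv = config["conversation"]
    system_prompt = config["prompts"][conv["prompt"]]
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        # The extra parameters from the config, forwarded as-is:
        "stream": conv["stream"],
        "max_tokens": conv["max_tokens"],
        "temperature": conv["temperature"],
    }

payload = build_payload(CONFIG, "Hello!")
# This payload would then be POSTed to the chat completions endpoint.
```

The key point is that the client only has to forward the values; the API itself interprets max_tokens, temperature, and stream.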

marcolardera commented 1 year ago

Hi, thank you for your feedback! I think these are all very useful enhancements.

I just implemented 3 and 4, the temperature and max_tokens parameters, with the latest commit ( https://github.com/marcolardera/chatgpt-cli/commit/362baded219d3000086ae7321e9fa3c661b54619 ).

1 is easy, I will work on it as soon as I have a bit of time.

2 also seems like a cool feature, but I need to study how to render the streaming response in the console.

AndreaPi commented 1 year ago

Great! Looking forward to the implementation of 1 and 2. Regarding this last one, I understand it's a bit more complicated, but it would really enhance usability. As for rendering, since you use rich (good choice 👍), this could help:

https://rich.readthedocs.io/en/stable/live.html
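A minimal, offline sketch of that approach: rich's Live display redraws the accumulated reply each time a new token arrives. The token generator here only simulates the deltas the API would stream when stream=true; the function names are illustrative, not part of chatgpt-cli:

```python
import time

from rich.live import Live
from rich.markdown import Markdown

def fake_token_stream():
    # Stand-in for the SSE chunks the API sends when stream=true;
    # each chunk carries a small delta of the reply.
    for token in ["ChatGPT ", "streams ", "its ", "reply ", "token ", "by ", "token."]:
        yield token
        time.sleep(0.05)

def render_stream(tokens):
    """Accumulate streamed tokens and refresh the console view as they arrive."""
    text = ""
    with Live(refresh_per_second=10) as live:
        for token in tokens:
            text += token
            # Re-render the full Markdown so formatting stays correct
            # even when a construct spans several tokens.
            live.update(Markdown(text))
    return text

reply = render_stream(fake_token_stream())
```

Re-rendering the whole accumulated text (rather than appending) is what lets Markdown constructs like code fences display correctly mid-stream.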