Groq is working absolutely fine, but the documentation is not updated, so please update it.
Right now I was running in interactive mode with the following command: `tgpt --provider groq --key xxxxxxxxxxxxx -i` and was able to converse. I believe the default model is Mixtral, which is absolutely fine and exactly what I wanted. So if I just use `groq` as the provider, it uses Mixtral, but Groq supports some more models too, like:
1) Llama 2 70B (4096 context length)
2) Llama 2 7B (2048 context length)
3) Gemma 7B (8K context length)
So are these also supported by default? If yes, how do I select and use one of these models instead of Mixtral? And if they are not supported for now, do you plan to add them?
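For reference, here is a minimal sketch of what model selection might look like, assuming tgpt's `--model` flag also applies to the `groq` provider and that Groq's published model IDs are accepted as-is (neither is confirmed by the current docs):

```sh
# Hypothetical usage: pass a Groq model ID via --model instead of
# relying on the Mixtral default. The IDs below are Groq's published
# identifiers at the time of writing, not taken from the tgpt docs.
tgpt --provider groq --key xxxxxxxxxxxxx --model "llama2-70b-4096" -i

# Likewise for Gemma 7B (Groq's API lists it as "gemma-7b-it"):
tgpt --provider groq --key xxxxxxxxxxxxx --model "gemma-7b-it" -i
```

If the provider currently ignores `--model`, it would be good to know whether hardcoding Mixtral is intentional or just not wired up yet.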