Closed · ferstpetr777 closed this 4 months ago
Hey, does Fireworks' Mixtral MoE 8x7B Instruct
have a 16K context?
Currently, they only offer a 4,000-token context length.
The model itself supports a longer context. Here's a screenshot from their official announcement: https://mistral.ai/news/mixtral-of-experts/
If the 32k-token context window stated for the model becomes available, GPT will no longer be needed; in my tests this model shows the best results with the right knowledge base and instructions. All that's missing is a large context window. Your dialog base tool is the best!
Nazim, a question: how do I add a MISTRAL_API_KEY (https://docs.mistral.ai/) to dialogbase?
The Mistral API is OpenAI-compatible. Go to the Models section in Settings, click the "Add new model" button, and paste the following URL: https://api.mistral.ai/v1, along with your API key.
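For reference, here is a minimal sketch of what that configuration amounts to: the standard OpenAI Python client pointed at Mistral's OpenAI-compatible endpoint. The model ID `open-mixtral-8x7b` and the `MISTRAL_API_KEY` environment variable are assumptions for illustration (check https://docs.mistral.ai/ for the IDs available to your account); this is not dialogbase's internal code.

```python
# Sketch: use Mistral's OpenAI-compatible endpoint with the OpenAI Python client.
# Assumptions: MISTRAL_API_KEY is set in the environment, and "open-mixtral-8x7b"
# is a model ID available to your account (see https://docs.mistral.ai/).
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.mistral.ai/v1",   # same URL entered in the settings form
    api_key=os.environ["MISTRAL_API_KEY"],  # your Mistral API key
)

response = client.chat.completions.create(
    model="open-mixtral-8x7b",  # assumed model ID; replace with one listed for your account
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response.choices[0].message.content)
```

If this direct call works but the in-app connection fails, the problem is likely in how the URL or key was entered rather than on Mistral's side.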
Nazim, respect! I did everything according to the instructions, but the connection did not work; something went wrong...
What error did you get?
The log is very large; please try it yourself when you have time, it's quite involved.
The full log is in this Google Doc: https://docs.google.com/document/d/1iAsWInJdkFVGZ1o4MkcDsIZE2eW5rKtm8cszlghtfIY/edit?usp=sharing
Nazim, hi! Can you please tell me if there is a way to increase the context window to 16K for the Fireworks Mixtral MoE 8x7B Instruct model? 4,000 tokens is not enough to capture large texts.
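As a side note, the context limit is enforced server-side by the provider, so a client cannot raise it on its own; you can only check what the backend currently accepts. Below is a rough sketch of such a probe. The base URL, the `FIREWORKS_API_KEY` environment variable, and the model ID are assumptions taken from Fireworks' public documentation, not from dialogbase itself.

```python
# Sketch: probe whether the provider accepts a prompt longer than ~4,000 tokens.
# Assumptions: Fireworks exposes an OpenAI-compatible endpoint at the URL below,
# FIREWORKS_API_KEY is set in the environment, and the model ID is correct.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["FIREWORKS_API_KEY"],           # hypothetical env var for the key
)

# Build a prompt clearly longer than 4,000 tokens (roughly 12,000 tokens here).
long_text = "lorem ipsum " * 4000

try:
    response = client.chat.completions.create(
        model="accounts/fireworks/models/mixtral-8x7b-instruct",  # assumed model ID
        messages=[{"role": "user", "content": f"Summarize this text:\n{long_text}"}],
        max_tokens=128,
    )
    print("Long prompt accepted:", response.choices[0].message.content[:200])
except Exception as exc:
    # A context-length error here means the provider still caps the model below the prompt size.
    print("Request rejected:", exc)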