n4ze3m / dialoqbase

Create chatbots with ease
https://dialoqbase.n4ze3m.com/
MIT License

context window to 16K #204

Closed ferstpetr777 closed 4 months ago

ferstpetr777 commented 5 months ago

Hi Nazim! Can you please tell me if there is a way to increase the context window to 16K for the Fireworks model Mixtral MoE 8x7B Instruct? Just 4,000 tokens is not enough to capture large texts in messages.

n4ze3m commented 5 months ago

Hey, does Fireworks' Mixtral MoE 8x7B Instruct have a 16K context?

n4ze3m commented 5 months ago

Currently, they only offer a 4,000-token context length.


ferstpetr777 commented 5 months ago

The model itself supports this. Here's a screenshot from their official website: https://mistral.ai/news/mixtral-of-experts/

(Screenshot 2024-01-26 at 20:21:05)
ferstpetr777 commented 5 months ago

If the 32k-token context window becomes available, as stated for the model, then GPT is no longer needed; in my tests this model shows the best results with the right knowledge base and instructions. All that is needed is a large context window. Your tool dialoqbase is the best!

ferstpetr777 commented 5 months ago

Nazim, I have a question: how do I add a MISTRAL_API_KEY to dialoqbase? https://docs.mistral.ai/

n4ze3m commented 5 months ago

The Mistral API is OpenAI-compatible. To proceed, navigate to the models section in settings, click the add new model button, and paste the following URL: https://api.mistral.ai/v1, along with your API key.
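If you want to sanity-check the endpoint outside of dialoqbase first, here is a minimal sketch using the `openai` npm package pointed at Mistral's OpenAI-compatible base URL. The model id and environment variable name are assumptions for illustration, not taken from dialoqbase itself:

```typescript
// Minimal sketch: verify Mistral's OpenAI-compatible endpoint outside dialoqbase.
// Assumes the `openai` npm package is installed and MISTRAL_API_KEY is set.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.mistral.ai/v1", // same base URL pasted into dialoqbase's model settings
  apiKey: process.env.MISTRAL_API_KEY,
});

async function main() {
  // "mistral-small-latest" is just an example model id; use any model your key has access to.
  const completion = await client.chat.completions.create({
    model: "mistral-small-latest",
    messages: [{ role: "user", content: "Say hello in one short sentence." }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

If this call succeeds but the connection inside dialoqbase still fails, the problem is likely in the URL or key as entered in the model settings rather than on Mistral's side.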

ferstpetr777 commented 5 months ago

Nazim, respect! I did everything according to the instructions, but the connection did not work. Something went wrong...

n4ze3m commented 5 months ago

What error did you get?

ferstpetr777 commented 5 months ago

There is a very large log; please try it yourself when you have time, it's very complicated...

ferstpetr777 commented 5 months ago

The whole log is at this Google Doc link: https://docs.google.com/document/d/1iAsWInJdkFVGZ1o4MkcDsIZE2eW5rKtm8cszlghtfIY/edit?usp=sharing