01-ai / Yi-Coder

🌟 Yi-Coder is a series of open-source code language models that delivers state-of-the-art coding performance with fewer than 10 billion parameters.

Output tokens? #6

Closed tmikaeld closed 2 months ago

tmikaeld commented 2 months ago

What's the max output tokens the model can produce?

tmikaeld commented 2 months ago

Seems to be 1024, can't get it to go over that

Haijian06 commented 2 months ago

Hi @tmikaeld 👋 Our model currently supports a maximum context length of 128k tokens. Larger context windows (higher max_token values) require significantly more GPU memory. In the provided examples, we've used a context length of 1024 tokens to make it easier for more people to quickly experience the capabilities of the model.

tmikaeld commented 2 months ago

When I try it with Ollama, the model doesn't output more than 1024 tokens. Do you know what could cause this?

tmikaeld commented 2 months ago

This isn't documented clearly anywhere, but in the Continue extension for VS Code, add the following to the model's config parameters for Ollama:

{
  "completionOptions": {
    "maxTokens": 4096
  }
}
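The limit can also be raised in Ollama itself. A minimal sketch, assuming Ollama's standard Modelfile syntax (`num_predict` controls the maximum number of output tokens; the `yi-coder` model tag here is illustrative):

```
# Modelfile: derive a variant with a larger output-token budget
FROM yi-coder
PARAMETER num_predict 4096
```

Then build and run the variant:

```
ollama create yi-coder-long -f Modelfile
ollama run yi-coder-long
```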