carlrobertoh / CodeGPT

The leading open-source AI copilot for JetBrains. Connect to any model in any environment, and customize your coding experience in any way you like.
https://codegpt.ee
Apache License 2.0

New context size limit for Llama 3 in settings #572

Open sdudzin opened 4 months ago

sdudzin commented 4 months ago

What happened?

Llama 3 has a new context size of 8192, as opposed to 4096 for Llama 2. However, the settings dialog limits the context size to a maximum of 4096, and the text description says the limit is 2048.

Relevant log output or stack trace

No response

Steps to reproduce

No response

CodeGPT version

2.7.1

Operating System

macOS

PhilKes commented 4 months ago

You are right, the UI text is a bit misleading/unhelpful, and the context size should be model-dependent. The question is whether we should define hard context size limits at all. You can run any model with a larger context size than it was trained on; completion quality usually gets much worse at some point, but if a user still wants to do that, why not?

I think I would prefer not to put a hard limit on the prompt context size input field, and instead only show a hint below it with the context size the model was trained on (e.g. 8k for Llama 3, 64k for CodeQwen), or even use that as the default value, rather than defining a max value.
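
For illustration, a minimal sketch of that idea, assuming a plain Swing spinner plus hint label; the model ids, map values, and function names here are made up for the example and are not CodeGPT's actual code:

```kotlin
import javax.swing.JLabel
import javax.swing.JSpinner
import javax.swing.SpinnerNumberModel

// Assumed trained context sizes keyed by model id; real ids/values would come
// from wherever CodeGPT already keeps its model metadata.
val trainedContextSizes = mapOf(
    "llama-3" to 8192,
    "llama-2" to 4096,
    "codeqwen" to 65536,
)

const val FALLBACK_CONTEXT_SIZE = 4096

// Use the trained context size as the default and as a hint, but leave the
// field without an upper bound so users can exceed it if they want to.
fun configureContextSizeField(modelId: String, spinner: JSpinner, hintLabel: JLabel) {
    val trained = trainedContextSizes[modelId] ?: FALLBACK_CONTEXT_SIZE
    val model = SpinnerNumberModel()
    model.value = trained   // default to the trained context size
    model.minimum = 1
    model.maximum = null    // no hard max
    model.stepSize = 256
    spinner.model = model
    hintLabel.text = "Trained context size: $trained tokens (larger values may reduce completion quality)"
}
```

That would keep the field unconstrained while still nudging users toward the trained size, in line with the "hint or default instead of max" suggestion above.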