LostRuins / koboldcpp

Run GGUF models easily with a KoboldAI UI. One File. Zero Install.
https://github.com/lostruins/koboldcpp
GNU Affero General Public License v3.0

[OVERSIGHT] (Kobold v1.4.7) -> Wrong default ROPE for CodeLlama. #487

Closed · SabinStargem closed this issue 11 months ago

SabinStargem commented 1 year ago

The RoPE frequency base is supposed to be 1,000,000. The default used in KoboldCpp is 10,000. Airoboros 34b repeats itself with KoboldCpp unless the proper RoPE value is used.

LostRuins commented 1 year ago

If it's a GGUF model, it needs to be correctly configured by the model creator. KoboldCpp sets it based on the n_ctx_train value.

SabinStargem commented 1 year ago

Going by what I am seeing in the terminal, the model places the 1,000,000 under "freq_base_train".

[screenshot: terminal output showing freq_base_train]

Hm. Maybe TheBloke isn't building it right? That said, you would think Ooba and other clients would have complaints about the model not working.

LostRuins commented 1 year ago

That's the wrong parameter. The value that matters for RoPE scaling is n_ctx_train, i.e. the training context. On this model it is 16k, so it will use RoPE scaling of 1.0 10000 for a 16k context size, i.e. a 1-to-1 ratio.

You can customize it with --ropeconfig, of course.
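As a hedged example of that manual override (the model filename here is illustrative, not from this thread), forcing the base CodeLlama-family models expect might look like:

```shell
# --ropeconfig takes a frequency scale followed by a frequency base.
# Here: no linear scaling (1.0) and the 1,000,000 base CodeLlama expects.
python koboldcpp.py --model airoboros-34b.Q4_K_M.gguf --ropeconfig 1.0 1000000
```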

LostRuins commented 1 year ago

Can you find another model that uses freq_base_train? If it's a commonly set parameter, I can include it in my calculations: I will use n_ctx_train unless freq_base_train is specified, in which case that value overrides it.
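The fallback order described above could be sketched like this. This is a minimal illustration of the behaviour discussed in the thread, not koboldcpp's actual code, and the metadata key names are assumptions:

```python
def pick_rope_config(metadata: dict, requested_ctx: int) -> tuple[float, float]:
    """Return (freq_scale, freq_base) for RoPE -- a sketch of the
    selection logic discussed in this thread, not koboldcpp's code."""
    # An explicit frequency base stored in the model file wins outright.
    if "rope.freq_base" in metadata:
        return 1.0, float(metadata["rope.freq_base"])

    # Otherwise fall back to the training context length. If the user
    # requests no more context than the model was trained on, keep the
    # default base of 10,000 at a 1:1 scale.
    n_ctx_train = metadata.get("n_ctx_train", 2048)
    if requested_ctx <= n_ctx_train:
        return 1.0, 10000.0

    # Beyond the training context, apply simple linear scaling
    # (illustrative only; real heuristics differ).
    return n_ctx_train / requested_ctx, 10000.0


# A CodeLlama-style model that records its base explicitly:
print(pick_rope_config({"rope.freq_base": 1000000, "n_ctx_train": 16384}, 16384))
# -> (1.0, 1000000.0)
```

Under this sketch, a 16k model with no explicit base would still get the default 10,000, which matches the garbage output reported above.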

SabinStargem commented 1 year ago

Here is Airoboros L2-70b, which has 10,000 as freq_base_train. Not sure if it is specific to the family. The value is probably generated from rope_theta, going by what I see in the config.json for the PyTorch versions.
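For reference, rope_theta can be read straight from a model's Hugging Face config.json, which GGUF conversion is expected to carry into the file's RoPE metadata. The inline JSON here is an illustrative stand-in for a real config, not one of the files from this thread:

```python
import json

# Stand-in for the config.json shipped with a CodeLlama-family checkpoint.
raw = '{"max_position_embeddings": 16384, "rope_theta": 1000000.0}'
config = json.loads(raw)

# 10,000 is the Llama 2 default when rope_theta is absent.
rope_theta = config.get("rope_theta", 10000.0)
print(rope_theta)  # 1000000.0
```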

[screenshot: terminal output]

[screenshot: terminal showing freq_base_train 1,000,000]

[screenshot: terminal showing freq_base_train 10,000]

SabinStargem commented 1 year ago

I found another 34b model, Synthia v1.2. It has freq_base_train 1,000,000 in Kobold's terminal. Its output is similar to Airoboros 34b's if the RoPE isn't customized.

[screenshot: terminal output showing freq_base_train 1,000,000]

EDIT: Also tested CodeLlama 7b. It also produces garbage without tweaking the RoPE. You should be able to use that model for testing, since it is a 7b.

LostRuins commented 11 months ago

This should now be fixed; please try v1.48.

SabinStargem commented 11 months ago

Airoboros 34b successfully generated. :)