henk717 / KoboldAI

KoboldAI is generative AI software optimized for fictional use, but capable of much more!
http://koboldai.com
GNU Affero General Public License v3.0

RuntimeError: start (958) + length (1624) exceeds dimension size (2048). (with exllama) #463

Closed BlairSadewitz closed 11 months ago

BlairSadewitz commented 12 months ago

This happens when I try to use a context size over 2048, e.g. 4096+. (I have not pinned down the exact point at which it starts.)

Here is the traceback (that's what this is called, right?) it dumps:

```
File "aiserver.py", line 3906, in generate
  genout, already_generated = tpool.execute(model.core_generate, txt, found_entries, gen_mode=gen_mode)
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/eventlet/tpool.py", line 132, in execute
  six.reraise(c, e, tb)
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/six.py", line 719, in reraise
  raise value
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/eventlet/tpool.py", line 86, in tworker
  rv = meth(*args, **kwargs)
File "/opt/koboldai/modeling/inference_model.py", line 356, in core_generate
  result = self.raw_generate(
File "/opt/koboldai/modeling/inference_model.py", line 629, in raw_generate
  result = self._raw_generate(
File "/opt/koboldai/modeling/inference_models/exllama/class.py", line 308, in _raw_generate
  self.generator.gen_begin_reuse(gen_in)
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/exllama/generator.py", line 223, in gen_begin_reuse
  if reuse < in_tokens.shape[-1]: self.gen_feed_tokens(in_tokens[:, reuse:], mask = mask)
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/exllama/generator.py", line 243, in gen_feed_tokens
  self.model.forward(self.sequence[:, start : -1], self.cache, preprocess_only = True, lora = self.lora, input_mask = mask)
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/exllama/model.py", line 966, in forward
  r = self._forward(input_ids[:, chunk_begin : chunk_end],
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/exllama/model.py", line 1052, in _forward
  hidden_states = decoder_layer.forward(hidden_states, cache, buffers[device], lora)
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/exllama/model.py", line 530, in forward
  hidden_states = self.self_attn.forward(hidden_states, cache, buffer, lora)
File "/opt/koboldai/runtime/envs/koboldai/lib/python3.8/site-packages/exllama/model.py", line 434, in forward
  new_keys = cache.key_states[self.index].narrow(2, past_len, q_len).narrow(0, 0, bsz)
RuntimeError: start (958) + length (1624) exceeds dimension size (2048).
```
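For context, the failure is a bounds check on the attention key/value cache: the cache was allocated for 2048 positions, but the generator tried to write a 1624-token chunk starting at position 958. A minimal pure-Python sketch of the check that `torch.Tensor.narrow` performs (the function name here is illustrative, not exllama's actual API):

```python
def narrow_check(dim_size: int, start: int, length: int) -> None:
    """Mimic the bounds check torch.Tensor.narrow performs before
    returning a view of `length` elements beginning at `start`."""
    if start + length > dim_size:
        raise RuntimeError(
            f"start ({start}) + length ({length}) "
            f"exceeds dimension size ({dim_size})."
        )

# Cache allocated for a 2048-token context; writing 1624 more tokens
# after 958 already cached overflows it, matching the traceback above.
try:
    narrow_check(2048, 958, 1624)
except RuntimeError as e:
    print(e)  # start (958) + length (1624) exceeds dimension size (2048).

# With the cache sized for 4096 positions, the same write fits.
narrow_check(4096, 958, 1624)
```

This is why the error appears only once the prompt grows past the 2048-token cache the model was loaded with.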

henk717 commented 12 months ago

Which model did you test this with?

GrennKren commented 12 months ago

Try adding an extra parameter with --model_parameters '{"max_ctx":4096}'
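Note that the value passed to `--model_parameters` must be valid JSON, or it will fail to parse. A quick, generic way to sanity-check the string before launching (not KoboldAI-specific code):

```python
import json

# The exact string passed to --model_parameters, including the braces.
params = '{"max_ctx":4096}'

# json.loads raises json.JSONDecodeError on malformed input,
# e.g. single quotes or a trailing comma.
overrides = json.loads(params)
print(overrides["max_ctx"])
```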

pi6am commented 12 months ago

You can also set the maximum context size from the UI when you load a model. [screenshot of the model load dialog]