LostRuins / koboldcpp

A simple one-file way to run various GGML and GGUF models with KoboldAI's UI
https://github.com/lostruins/koboldcpp
GNU Affero General Public License v3.0

[Feature request] Ability to cache context between runs for faster initial generation of the same history (after app restart) #445

Open aleksusklim opened 9 months ago

aleksusklim commented 9 months ago

I mean the context of the "current" generation, the one that makes regenerating the latest action, and minor edits to the history, super fast.

I propose adding an option/parameter that points to a local binary file. If it exists, koboldcpp should read the context from it. While this option is active, koboldcpp should update that file with the new context after each generation (read once at start, rewrite/append during runtime).

I can name two main reasons why this will be extremely useful:

Yes, I understand that with any accidental move I can easily destroy the cache (loading the wrong history, adding a space at the start of the text, etc.), which would ultimately lead to full regeneration. But if I play locally and alone, that would be only my own fault. With that avoided, I could restart my system at any time and continue playing instantly later.

Also, for the use case of big story templates, you might add an additional option to make this context read-only, as in https://github.com/ggerganov/llama.cpp/pull/1640
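For reference, llama.cpp already exposes a session-file API that a feature like this could build on. A minimal sketch, assuming an already-created `llama_context` and the tokenized history; the helper names and the surrounding wiring are hypothetical, not actual koboldcpp code:

```cpp
#include "llama.h"
#include <string>
#include <vector>

// Restore the KV cache (and the matching token list) from disk, if the file exists.
// Returns the number of tokens restored, or 0 if there was no usable cache.
static size_t try_load_context_cache(llama_context * ctx,
                                     const std::string & path,
                                     std::vector<llama_token> & history,
                                     size_t max_tokens) {
    history.resize(max_tokens);
    size_t n_loaded = 0;
    if (!llama_load_session_file(ctx, path.c_str(),
                                 history.data(), history.size(), &n_loaded)) {
        history.clear();
        return 0; // no cache yet, or it was saved with a different model
    }
    history.resize(n_loaded);
    return n_loaded; // these tokens are already evaluated; ingestion can be skipped
}

// Persist the current KV cache after each generation ("rewrite during runtime").
static bool save_context_cache(llama_context * ctx,
                               const std::string & path,
                               const std::vector<llama_token> & history) {
    return llama_save_session_file(ctx, path.c_str(),
                                   history.data(), history.size());
}
```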

LostRuins commented 7 months ago

The issue is not conceptual; rather, it is the way the ring buffer is implemented. Llamacpp does not have this issue because they never allow users to manually remove tokens from the middle of the context. Although I suppose if you use it with n_keep, then it may have the same issue too.

Why? Isn't it better to always defer to the server value? What is the point of having a smaller context size in Lite?

There are situations where you don't need the full context; you can reduce it to allow unwanted early parts of the story to be truncated away.

Can it do that twice? Or as many times as needed, fitting each minibatch into the next contiguous stride?

Sure, reduce your BLAS batch and predict fewer tokens at once.

In reality, I still don't understand what the model perceives when its memory is shifted. Does it "see" the gap? Imagine a 16k context with 1k of memory and 3k of active history at the end: will the model "understand" that there were 12k of "something" it can no longer comprehend, or will it see just 4k as a direct concatenation?

Whatever is shifted out is gone. There is no memory of it. You can see what the context contains by running with --debugmode.
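One way to picture it (an illustration only, assuming the shift re-bases the surviving positions the way llama.cpp's KV-cache shift does; this is not koboldcpp's actual code):

```cpp
// Toy illustration: after a context shift the surviving positions are moved back
// by the number of discarded tokens, so the model is left with one gapless
// sequence rather than a sequence with a visible hole.
#include <cstdio>

int main() {
    const int n_keep    = 1000;   // "memory" kept at the front
    const int n_discard = 12000;  // middle of the story that was shifted out
    const int n_recent  = 3000;   // visible end of the history

    // Recent tokens originally sat at positions 13000..15999; after the shift
    // they occupy 1000..3999, directly after the kept memory.
    for (int i = 0; i < n_recent; i += 1000) {
        const int old_pos = n_keep + n_discard + i;
        const int new_pos = old_pos - n_discard;
        std::printf("token at old pos %5d -> new pos %4d\n", old_pos, new_pos);
    }
    // Net effect: a 4000-token context that looks like memory + recent history
    // concatenated directly, with no trace of the 12000 discarded tokens.
    return 0;
}
```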

If you are matching the cached context against the new user prompt, then it should not matter whether the prompt was truncated (from the beginning) or not: even if it is truncated, as long as a true match is found, your code should behave just as if it had truncated it on its own. Why is it different?

When you shift tokens out of the context, you create "holes" inside the KV cache which later get filled with new data. These holes have to be contiguous when processing a large batch. I suppose it is possible to trigger this issue automatically too, but by default Lite is designed to shift out the same number of tokens that it attempts to generate, so the batch size should match the size of the gap left by the removed tokens, letting it fit fine.

A better explanation would probably require looking through the shifting code here https://github.com/LostRuins/koboldcpp/blob/concedo/gpttype_adapter.cpp#L593
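As a rough mental model only (not the actual logic in gpttype_adapter.cpp), the contiguity requirement can be pictured like this:

```cpp
// A batch of B freshly processed tokens needs B contiguous free cells in the
// KV cache, so a hole left by removed tokens can only absorb a batch that is
// no larger than itself.
#include <cstdio>
#include <vector>

// Length of the longest contiguous run of free cells.
static int largest_free_run(const std::vector<bool> & used) {
    int best = 0, cur = 0;
    for (const bool u : used) {
        cur = u ? 0 : cur + 1;
        if (cur > best) best = cur;
    }
    return best;
}

int main() {
    // 16-cell cache; deleting tokens from the middle left an 8-cell hole.
    const std::vector<bool> used = {true, true, true, true,
                                    false, false, false, false,
                                    false, false, false, false,
                                    true, true, true, true};
    const int hole = largest_free_run(used);
    for (const int batch : {4, 8, 12}) {
        std::printf("batch of %2d into a hole of %d: %s\n", batch, hole,
                    batch <= hole ? "fits" : "does NOT fit contiguously");
    }
    return 0;
}
```

In this toy picture, a batch that matches the gap (the default Lite behaviour described above) slots straight in, while a larger batch has nowhere contiguous to go.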

aleksusklim commented 7 months ago

by default lite is designed to shift out the same number of new tokens that it attempts to generate, so the batch size should be the same size as the gap from any removed tokens

Oh, interesting. Can this also be a reason why it is dangerous to edit something above the last turn?

For example, with a low "amount to predict" (e.g. 128), when I edit "one turn above", the total number of discarded tokens would be larger (maybe 200), but the prediction batch is again 128. Wait, no, it should just shift 128 and be fine again…

Whatever that is shifted out is gone. There is no memory of it.

But the rotary position embeddings would be different for "text that had a gap" compared to its clean version (made by direct concatenation of "memory" and "the visible end of the history"), yeah? Or does that not matter much for the model's reasoning? (That is, "it doesn't care" how the older visible text was produced, with or without access to the deleted parts, since now it sees only this visible text.)

Sure, reduce your BLAS batch and predict less tokens at once.

If I set BLAS Batch Size to "Don't Batch BLAS", would this work around the problem?

I tried adding --blasbatchsize -1 to my last example above, and it looks like it no longer errors (the token counter updates every 8 tokens). Also, there are no more regenerations from scratch after I raised my threshold for cutting text above the real context size (5000 versus 4096).

Hmm, would it be possible to make the batch size adapt automatically to the largest hole in the cache?

Alternatively, maybe you should just disable BLAS batching completely after the first context shift (enabling it again whenever a full re-evaluation happens for any reason)? Batching is very important during initial prompt ingestion, but not so much when the user is playing turn by turn.
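A purely hypothetical sketch of that adaptive idea (none of these names exist in koboldcpp): clamp the effective batch to the largest contiguous free run instead of turning batching off outright.

```cpp
#include <algorithm>
#include <vector>

// Length of the longest contiguous run of free KV-cache cells.
static int largest_free_run(const std::vector<bool> & kv_used) {
    int best = 0, cur = 0;
    for (const bool u : kv_used) {
        cur = u ? 0 : cur + 1;
        best = std::max(best, cur);
    }
    return best;
}

// Effective batch size for the next evaluation pass (hypothetical helper).
static int adaptive_batch_size(int requested_blas_batch,
                               const std::vector<bool> & kv_used) {
    const int hole = largest_free_run(kv_used);
    // Never try to place more tokens than the biggest hole can hold; fall back
    // to unbatched, one-token-at-a-time processing if the cache is fragmented.
    return std::max(1, std::min(requested_blas_batch, hole));
}
```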

lite is designed to shift out the same number of new tokens that it attempts to generate

Still... (I wanted to say something here, but I ran further experiments, and now my results don't seem right again!)

Remember the command I showed? I put -1 for the batch size, done. Now remember my initial prompt with numbers, right from my previous logs? I send it and see 4080 tokens. Then I send it again and see 1 tokens. Then I remove the last two turns from the end, so that
…USER: 61313977567118212\nMODEL: 455090158550265\nUSER: 286579237057171\nMODEL:
becomes
…USER: 61313977567118212\nMODEL:

And I send it: 4042 tokens.

It regenerated! Why!? ContextShifting hasn't even kicked in!

Console output ``` koboldcpp-1.50.1.exe --model sciphi-mistral-7b-32k.Q5_K_M.gguf --port 5001 --host 127.0.0.1 --launch --threads 8 --contextsize 4096 --blasbatchsize -1 --skiplauncher *** Welcome to KoboldCpp - Version 1.50.1 Attempting to use OpenBLAS library for faster prompt ingestion. A compatible libopenblas will be required. Initializing dynamic library: koboldcpp_openblas.dll ========== Namespace(bantokens=None, blasbatchsize=-1, blasthreads=8, config=None, contextsize=4096, debugmode=0, forceversion=0, foreground=False, gpulayers=0, highpriority=False, hordeconfig=None, host='127.0.0.1', launch=True, lora=None, model='sciphi-mistral-7b-32k.Q5_K_M.gguf', model_param='sciphi-mistral-7b-32k.Q5_K_M.gguf', multiuser=False, noavx2=False, noblas=False, nommap=False, noshift=False, onready='', port=5001, port_param=5001, preloadstory='', remotetunnel=False, ropeconfig=[0.0, 10000.0], skiplauncher=True, smartcontext=False, tensor_split=None, threads=8, useclblast=None, usecublas=None, usemlock=False) ========== Loading model: C:\NN\GPT\sciphi-mistral-7b-32k.Q5_K_M.gguf [Threads: 8, BlasThreads: 8, SmartContext: False, ContextShift: True] --- Identified as LLAMA model: (ver 6) Attempting to Load... --- Using automatic RoPE scaling. If the model has customized RoPE settings, they will be used directly instead! System Info: AVX = 1 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | llama_model_loader: loaded meta data with 20 key-value pairs and 291 tensors from C:\NN\GPT\sciphi-mistral-7b-32k.Q5_K_M.gguf (version GGUF V3 (latest)) llm_load_vocab: special tokens definition check successful ( 259/32000 ). llm_load_print_meta: format = GGUF V3 (latest) llm_load_print_meta: arch = llama llm_load_print_meta: vocab type = SPM llm_load_print_meta: n_vocab = 32000 llm_load_print_meta: n_merges = 0 llm_load_print_meta: n_ctx_train = 32768 llm_load_print_meta: n_embd = 4096 llm_load_print_meta: n_head = 32 llm_load_print_meta: n_head_kv = 8 llm_load_print_meta: n_layer = 32 llm_load_print_meta: n_rot = 128 llm_load_print_meta: n_gqa = 4 llm_load_print_meta: f_norm_eps = 0.0e+00 llm_load_print_meta: f_norm_rms_eps = 1.0e-05 llm_load_print_meta: f_clamp_kqv = 0.0e+00 llm_load_print_meta: f_max_alibi_bias = 0.0e+00 llm_load_print_meta: n_ff = 14336 llm_load_print_meta: rope scaling = linear llm_load_print_meta: freq_base_train = 10000.0 llm_load_print_meta: freq_scale_train = 1 llm_load_print_meta: n_yarn_orig_ctx = 32768 llm_load_print_meta: rope_finetuned = unknown llm_load_print_meta: model type = 7B llm_load_print_meta: model ftype = unknown, may not work llm_load_print_meta: model params = 7.24 B llm_load_print_meta: model size = 4.78 GiB (5.67 BPW) llm_load_print_meta: general.name = sciphi_sciphi-mistral-7b-32k llm_load_print_meta: BOS token = 1 '' llm_load_print_meta: EOS token = 2 '' llm_load_print_meta: UNK token = 0 '' llm_load_print_meta: LF token = 13 '<0x0A>' llm_load_tensors: ggml ctx size = 0.11 MiB llm_load_tensors: mem required = 4893.10 MiB .................................................................................................. Automatic RoPE Scaling: Using (scale:1.000, base:10000.0). 
llama_new_context_with_model: n_ctx = 4096 llama_new_context_with_model: freq_base = 10000.0 llama_new_context_with_model: freq_scale = 1 llama_new_context_with_model: kv self size = 512.00 MiB llama_build_graph: non-view tensors processed: 740/740 llama_new_context_with_model: compute buffer total size = 7.56 MiB Load Model OK: True Embedded Kobold Lite loaded. Starting Kobold HTTP Server on port 5001 Please connect to custom endpoint at http://127.0.0.1:5001 Input: {"memory": "SYSTEM: The user will give you a number and you should reply with any other number that is not present anywhere in this conversation. Ignore anything in square brackets.\n", "prompt": "USER: 6589112875847360\nMODEL: 5221120297854\nUSER: 55737552258767924\nMODEL: 171147791412787\nUSER: 6473628829923952\nMODEL: 113445676260994\nUSER: 754687669217837\nMODEL: 748415888935305\nUSER: 2943794556524911\nMODEL: 64963611977926924\nUSER: 749251432169676\nMODEL: 284915117750771\nUSER: 23762778361960\nMODEL: 726973414288314\nUSER: 6294815116415302\nMODEL: 831728945258813\nUSER: 2416355559566672\nMODEL: 769856832167515\nUSER: 833505019083878\nMODEL: 428935819673835\nUSER: 196433518850818\nMODEL: 694466406946425\nUSER: 6693492952250681\nMODEL: 9886538762579598\nUSER: 31153191377989\nMODEL: 528992611081949\nUSER: 4948976292578274\nMODEL: 641324003629054\nUSER: 647151845961076\nMODEL: 473586293516108\nUSER: 694665244326346\nMODEL: 325243656438017\nUSER: 89149614037735700\nMODEL: 172925058341368\nUSER: 2584934113846547\nMODEL: 535438810876705\nUSER: 1489972515969589\nMODEL: 61313015685095210\nUSER: 7145163881248354\nMODEL: 912529620891338\nUSER: 1986969199340750\nMODEL: 392667333300659\nUSER: 66362586577268881\nMODEL: 745394151580354\nUSER: 5259222947359748\nMODEL: 924666859391499\nUSER: 4733666166492483\nMODEL: 62397363865373824\nUSER: 9752783678768986\nMODEL: 819016930961027\nUSER: 699319256229545\nMODEL: 152478105285539\nUSER: 258776952588553\nMODEL: 4275479082985044\nUSER: 243631276329791\nMODEL: 7736295332426448\nUSER: 2988314542928845\nMODEL: 563196513898948\nUSER: 8389394339524729\nMODEL: 594815822316908\nUSER: 418656346732644\nMODEL: 4587852316336852\nUSER: 549984151420547\nMODEL: 4697674727525023\nUSER: 654313914600816\nMODEL: 4742493919950903\nUSER: 1621996554049821\nMODEL: 6318661835040562\nUSER: 524955549679593\nMODEL: 5922217804767002\nUSER: 4656781654151793\nMODEL: 427839920435515\nUSER: 319211200347996\nMODEL: 192359974204076\nUSER: 4272035744957720\nMODEL: 124197499768913\nUSER: 189183048487357\nMODEL: 49136154247452141\nUSER: 16864237489539210\nMODEL: 515625417462163\nUSER: 572945385808785\nMODEL: 4481827414978663\nUSER: 3952258861878379\nMODEL: 486910226996774\nUSER: 5876569231906082\nMODEL: 698476162578063\nUSER: 251882928198813\nMODEL: 364671043585596\nUSER: 16975418640787\nMODEL: 4912821759117671\nUSER: 83272557195384\nMODEL: 9295745573112777\nUSER: 426978695244746\nMODEL: 961981996398428\nUSER: 2132463855293933\nMODEL: 5965897461137514\nUSER: 926999987766368\nMODEL: 442597274669536\nUSER: 815198772437867\nMODEL: 71233591097281\nUSER: 3153265826758165\nMODEL: 7384456628567550\nUSER: 6677311435351158\nMODEL: 734859195573746\nUSER: 64583116598439864\nMODEL: 256247833009836\nUSER: 589393213265962\nMODEL: 34739236277631673\nUSER: 434427170537359\nMODEL: 872904795759777\nUSER: 757893887877249\nMODEL: 18523307781765\nUSER: 897775123887549\nMODEL: 887517159447797\nUSER: 63582277633003\nMODEL: 876231948497173\nUSER: 2364421836534466\nMODEL: 255311527852424\nUSER: 56339479702334550\nMODEL: 534151378751267\nUSER: 
33034355770796210\nMODEL: 733691172768323\nUSER: 16211564648985\nMODEL: 47695153174872120\nUSER: 112631723916226\nMODEL: 1784638711295587\nUSER: 4129721463371851\nMODEL: 448679222951488\nUSER: 645333782391216\nMODEL: 155467133819039\nUSER: 8281313719284649\nMODEL: 3859701333774090\nUSER: 331125178101999\nMODEL: 92338960080935\nUSER: 651287714116996\nMODEL: 293788175446379\nUSER: 2854464637588240\nMODEL: 5924652888685462\nUSER: 724743617987274\nMODEL: 4916578932369290\nUSER: 49851798134521550\nMODEL: 527831739468322\nUSER: 1224278690343970\nMODEL: 5296913224356797\nUSER: 4258051597604860\nMODEL: 6469508497476612\nUSER: 754329054551748\nMODEL: 7158694381424794\nUSER: 751721111978474\nMODEL: 243816249188662\nUSER: 89119889554712\nMODEL: 848360204984756\nUSER: 4857527978860032\nMODEL: 6652830021988721\nUSER: 155323414839053\nMODEL: 839598492654557\nUSER: 5415381506664400\nMODEL: 643866384927243\nUSER: 749562904644214\nMODEL: 232050713935745\nUSER: 566816127332784\nMODEL: 4782566244729842\nUSER: 777657102314182\nMODEL: 45393179748428734\nUSER: 8171347836675464\nMODEL: 33282797279561\nUSER: 446588374499466\nMODEL: 945885207228935\nUSER: 4324566637175341\nMODEL: 782475600189972\nUSER: 999678232734222\nMODEL: 559986956532762\nUSER: 3578955240656490\nMODEL: 7367661787229494\nUSER: 6182794971652132\nMODEL: 5497383639689334\nUSER: 184188481168328\nMODEL: 684898728028691\nUSER: 710066863496316\nMODEL: 178561892522103\nUSER: 721286538628646\nMODEL: 6153492475927843\nUSER: 748671916689253\nMODEL: 57737673390369\nUSER: 2166581726046430\nMODEL: 914341311488645\nUSER: 767175911550435\nMODEL: 4862471794581228\nUSER: 4816153619387333\nMODEL: 6233658331291324\nUSER: 181463757661364\nMODEL: 31696266683155243\nUSER: 164894688181397\nMODEL: 313737856393608\nUSER: 26928122666465\nMODEL: 89985583667815\nUSER: 5957731164995686\nMODEL: 5188397246239\nUSER: 57542406129455\nMODEL: 586442737560384\nUSER: 133382969555969\nMODEL: 39754834854476331\nUSER: 768558798693312\nMODEL: 4356168414926503\nUSER: 4944886520099324\nMODEL: 5545782699010750\nUSER: 496868161177635\nMODEL: 21253571473618\nUSER: 852538164809696\nMODEL: 5768991864419517\nUSER: 71337773819459\nMODEL: 4841001018254991\nUSER: 766677272745725\nMODEL: 868253784796224\nUSER: 6919415487587923\nMODEL: 3835170330395410\nUSER: 4217188443239073\nMODEL: 7525358295705790\nUSER: 5751539646501820\nMODEL: 698562246650857\nUSER: 3381148171068\nMODEL: 939232606281694\nUSER: 4135792889197922\nMODEL: 96245578293568\nUSER: 26285978096562\nMODEL: 919733026233356\nUSER: 55287747514393\nMODEL: 691967756254688\nUSER: 4471579437738593\nMODEL: 762385496636856\nUSER: 923589972644988\nMODEL: 65959528338256\nUSER: 5477036817124270\nMODEL: 9475999990727340\nUSER: 696516972887026\nMODEL: 519449351602881\nUSER: 153697233971599\nMODEL: 115114517353716\nUSER: 754624618161429\nMODEL: 8329643672030112\nUSER: 749334762278195\nMODEL: 994992150647051\nUSER: 225233472858199\nMODEL: 543055627760971\nUSER: 67568833806256\nMODEL: 647739288869667\nUSER: 835826771426138\nMODEL: 94562158357263\nUSER: 695671316439557\nMODEL: 994539185333055\nUSER: 61313977567118212\nMODEL: 455090158550265\nUSER: 286579237057171\nMODEL: ", "stop_sequence": ["\n"], "genkey": "KCPP2261", "max_context_length": 4096, "max_length": 16, "sampler_order": [6, 0, 1, 3, 4, 2, 5], "rep_pen_range": 1024, "rep_pen_slope": 0.7, "n": 1, "temperature": 0.85, "min_p": 0.25, "rep_pen": 1.1, "top_p": 1, "top_k": 0, "top_a": 0, "typical": 1, "tfs": 1, "quiet": true, "use_default_badwordsids": false} Processing Prompt [BLAS] (4080 / 4080 
tokens) Generating (15 / 16 tokens) (Stop sequence triggered: \n) ContextLimit: 4095/4096, Processing:334.42s (82.0ms/T), Generation:2.11s (140.9ms/T), Total:336.53s (0.04T/s) Output: 97632138684420 Input: {"memory": "SYSTEM: The user will give you a number and you should reply with any other number that is not present anywhere in this conversation. Ignore anything in square brackets.\n", "prompt": "USER: 6589112875847360\nMODEL: 5221120297854\nUSER: 55737552258767924\nMODEL: 171147791412787\nUSER: 6473628829923952\nMODEL: 113445676260994\nUSER: 754687669217837\nMODEL: 748415888935305\nUSER: 2943794556524911\nMODEL: 64963611977926924\nUSER: 749251432169676\nMODEL: 284915117750771\nUSER: 23762778361960\nMODEL: 726973414288314\nUSER: 6294815116415302\nMODEL: 831728945258813\nUSER: 2416355559566672\nMODEL: 769856832167515\nUSER: 833505019083878\nMODEL: 428935819673835\nUSER: 196433518850818\nMODEL: 694466406946425\nUSER: 6693492952250681\nMODEL: 9886538762579598\nUSER: 31153191377989\nMODEL: 528992611081949\nUSER: 4948976292578274\nMODEL: 641324003629054\nUSER: 647151845961076\nMODEL: 473586293516108\nUSER: 694665244326346\nMODEL: 325243656438017\nUSER: 89149614037735700\nMODEL: 172925058341368\nUSER: 2584934113846547\nMODEL: 535438810876705\nUSER: 1489972515969589\nMODEL: 61313015685095210\nUSER: 7145163881248354\nMODEL: 912529620891338\nUSER: 1986969199340750\nMODEL: 392667333300659\nUSER: 66362586577268881\nMODEL: 745394151580354\nUSER: 5259222947359748\nMODEL: 924666859391499\nUSER: 4733666166492483\nMODEL: 62397363865373824\nUSER: 9752783678768986\nMODEL: 819016930961027\nUSER: 699319256229545\nMODEL: 152478105285539\nUSER: 258776952588553\nMODEL: 4275479082985044\nUSER: 243631276329791\nMODEL: 7736295332426448\nUSER: 2988314542928845\nMODEL: 563196513898948\nUSER: 8389394339524729\nMODEL: 594815822316908\nUSER: 418656346732644\nMODEL: 4587852316336852\nUSER: 549984151420547\nMODEL: 4697674727525023\nUSER: 654313914600816\nMODEL: 4742493919950903\nUSER: 1621996554049821\nMODEL: 6318661835040562\nUSER: 524955549679593\nMODEL: 5922217804767002\nUSER: 4656781654151793\nMODEL: 427839920435515\nUSER: 319211200347996\nMODEL: 192359974204076\nUSER: 4272035744957720\nMODEL: 124197499768913\nUSER: 189183048487357\nMODEL: 49136154247452141\nUSER: 16864237489539210\nMODEL: 515625417462163\nUSER: 572945385808785\nMODEL: 4481827414978663\nUSER: 3952258861878379\nMODEL: 486910226996774\nUSER: 5876569231906082\nMODEL: 698476162578063\nUSER: 251882928198813\nMODEL: 364671043585596\nUSER: 16975418640787\nMODEL: 4912821759117671\nUSER: 83272557195384\nMODEL: 9295745573112777\nUSER: 426978695244746\nMODEL: 961981996398428\nUSER: 2132463855293933\nMODEL: 5965897461137514\nUSER: 926999987766368\nMODEL: 442597274669536\nUSER: 815198772437867\nMODEL: 71233591097281\nUSER: 3153265826758165\nMODEL: 7384456628567550\nUSER: 6677311435351158\nMODEL: 734859195573746\nUSER: 64583116598439864\nMODEL: 256247833009836\nUSER: 589393213265962\nMODEL: 34739236277631673\nUSER: 434427170537359\nMODEL: 872904795759777\nUSER: 757893887877249\nMODEL: 18523307781765\nUSER: 897775123887549\nMODEL: 887517159447797\nUSER: 63582277633003\nMODEL: 876231948497173\nUSER: 2364421836534466\nMODEL: 255311527852424\nUSER: 56339479702334550\nMODEL: 534151378751267\nUSER: 33034355770796210\nMODEL: 733691172768323\nUSER: 16211564648985\nMODEL: 47695153174872120\nUSER: 112631723916226\nMODEL: 1784638711295587\nUSER: 4129721463371851\nMODEL: 448679222951488\nUSER: 645333782391216\nMODEL: 155467133819039\nUSER: 8281313719284649\nMODEL: 
3859701333774090\nUSER: 331125178101999\nMODEL: 92338960080935\nUSER: 651287714116996\nMODEL: 293788175446379\nUSER: 2854464637588240\nMODEL: 5924652888685462\nUSER: 724743617987274\nMODEL: 4916578932369290\nUSER: 49851798134521550\nMODEL: 527831739468322\nUSER: 1224278690343970\nMODEL: 5296913224356797\nUSER: 4258051597604860\nMODEL: 6469508497476612\nUSER: 754329054551748\nMODEL: 7158694381424794\nUSER: 751721111978474\nMODEL: 243816249188662\nUSER: 89119889554712\nMODEL: 848360204984756\nUSER: 4857527978860032\nMODEL: 6652830021988721\nUSER: 155323414839053\nMODEL: 839598492654557\nUSER: 5415381506664400\nMODEL: 643866384927243\nUSER: 749562904644214\nMODEL: 232050713935745\nUSER: 566816127332784\nMODEL: 4782566244729842\nUSER: 777657102314182\nMODEL: 45393179748428734\nUSER: 8171347836675464\nMODEL: 33282797279561\nUSER: 446588374499466\nMODEL: 945885207228935\nUSER: 4324566637175341\nMODEL: 782475600189972\nUSER: 999678232734222\nMODEL: 559986956532762\nUSER: 3578955240656490\nMODEL: 7367661787229494\nUSER: 6182794971652132\nMODEL: 5497383639689334\nUSER: 184188481168328\nMODEL: 684898728028691\nUSER: 710066863496316\nMODEL: 178561892522103\nUSER: 721286538628646\nMODEL: 6153492475927843\nUSER: 748671916689253\nMODEL: 57737673390369\nUSER: 2166581726046430\nMODEL: 914341311488645\nUSER: 767175911550435\nMODEL: 4862471794581228\nUSER: 4816153619387333\nMODEL: 6233658331291324\nUSER: 181463757661364\nMODEL: 31696266683155243\nUSER: 164894688181397\nMODEL: 313737856393608\nUSER: 26928122666465\nMODEL: 89985583667815\nUSER: 5957731164995686\nMODEL: 5188397246239\nUSER: 57542406129455\nMODEL: 586442737560384\nUSER: 133382969555969\nMODEL: 39754834854476331\nUSER: 768558798693312\nMODEL: 4356168414926503\nUSER: 4944886520099324\nMODEL: 5545782699010750\nUSER: 496868161177635\nMODEL: 21253571473618\nUSER: 852538164809696\nMODEL: 5768991864419517\nUSER: 71337773819459\nMODEL: 4841001018254991\nUSER: 766677272745725\nMODEL: 868253784796224\nUSER: 6919415487587923\nMODEL: 3835170330395410\nUSER: 4217188443239073\nMODEL: 7525358295705790\nUSER: 5751539646501820\nMODEL: 698562246650857\nUSER: 3381148171068\nMODEL: 939232606281694\nUSER: 4135792889197922\nMODEL: 96245578293568\nUSER: 26285978096562\nMODEL: 919733026233356\nUSER: 55287747514393\nMODEL: 691967756254688\nUSER: 4471579437738593\nMODEL: 762385496636856\nUSER: 923589972644988\nMODEL: 65959528338256\nUSER: 5477036817124270\nMODEL: 9475999990727340\nUSER: 696516972887026\nMODEL: 519449351602881\nUSER: 153697233971599\nMODEL: 115114517353716\nUSER: 754624618161429\nMODEL: 8329643672030112\nUSER: 749334762278195\nMODEL: 994992150647051\nUSER: 225233472858199\nMODEL: 543055627760971\nUSER: 67568833806256\nMODEL: 647739288869667\nUSER: 835826771426138\nMODEL: 94562158357263\nUSER: 695671316439557\nMODEL: 994539185333055\nUSER: 61313977567118212\nMODEL: 455090158550265\nUSER: 286579237057171\nMODEL: ", "stop_sequence": ["\n"], "genkey": "KCPP2261", "max_context_length": 4096, "max_length": 16, "sampler_order": [6, 0, 1, 3, 4, 2, 5], "rep_pen_range": 1024, "rep_pen_slope": 0.7, "n": 1, "temperature": 0.85, "min_p": 0.25, "rep_pen": 1.1, "top_p": 1, "top_k": 0, "top_a": 0, "typical": 1, "tfs": 1, "quiet": true, "use_default_badwordsids": false} Processing Prompt (1 / 1 tokens) Generating (15 / 16 tokens) (Stop sequence triggered: \n) ContextLimit: 4095/4096, Processing:0.16s (157.0ms/T), Generation:2.04s (135.9ms/T), Total:2.20s (6.83T/s) Output: 86343288197101 Input: {"memory": "SYSTEM: The user will give you a number and you should reply with 
any other number that is not present anywhere in this conversation. Ignore anything in square brackets.\n", "prompt": "USER: 6589112875847360\nMODEL: 5221120297854\nUSER: 55737552258767924\nMODEL: 171147791412787\nUSER: 6473628829923952\nMODEL: 113445676260994\nUSER: 754687669217837\nMODEL: 748415888935305\nUSER: 2943794556524911\nMODEL: 64963611977926924\nUSER: 749251432169676\nMODEL: 284915117750771\nUSER: 23762778361960\nMODEL: 726973414288314\nUSER: 6294815116415302\nMODEL: 831728945258813\nUSER: 2416355559566672\nMODEL: 769856832167515\nUSER: 833505019083878\nMODEL: 428935819673835\nUSER: 196433518850818\nMODEL: 694466406946425\nUSER: 6693492952250681\nMODEL: 9886538762579598\nUSER: 31153191377989\nMODEL: 528992611081949\nUSER: 4948976292578274\nMODEL: 641324003629054\nUSER: 647151845961076\nMODEL: 473586293516108\nUSER: 694665244326346\nMODEL: 325243656438017\nUSER: 89149614037735700\nMODEL: 172925058341368\nUSER: 2584934113846547\nMODEL: 535438810876705\nUSER: 1489972515969589\nMODEL: 61313015685095210\nUSER: 7145163881248354\nMODEL: 912529620891338\nUSER: 1986969199340750\nMODEL: 392667333300659\nUSER: 66362586577268881\nMODEL: 745394151580354\nUSER: 5259222947359748\nMODEL: 924666859391499\nUSER: 4733666166492483\nMODEL: 62397363865373824\nUSER: 9752783678768986\nMODEL: 819016930961027\nUSER: 699319256229545\nMODEL: 152478105285539\nUSER: 258776952588553\nMODEL: 4275479082985044\nUSER: 243631276329791\nMODEL: 7736295332426448\nUSER: 2988314542928845\nMODEL: 563196513898948\nUSER: 8389394339524729\nMODEL: 594815822316908\nUSER: 418656346732644\nMODEL: 4587852316336852\nUSER: 549984151420547\nMODEL: 4697674727525023\nUSER: 654313914600816\nMODEL: 4742493919950903\nUSER: 1621996554049821\nMODEL: 6318661835040562\nUSER: 524955549679593\nMODEL: 5922217804767002\nUSER: 4656781654151793\nMODEL: 427839920435515\nUSER: 319211200347996\nMODEL: 192359974204076\nUSER: 4272035744957720\nMODEL: 124197499768913\nUSER: 189183048487357\nMODEL: 49136154247452141\nUSER: 16864237489539210\nMODEL: 515625417462163\nUSER: 572945385808785\nMODEL: 4481827414978663\nUSER: 3952258861878379\nMODEL: 486910226996774\nUSER: 5876569231906082\nMODEL: 698476162578063\nUSER: 251882928198813\nMODEL: 364671043585596\nUSER: 16975418640787\nMODEL: 4912821759117671\nUSER: 83272557195384\nMODEL: 9295745573112777\nUSER: 426978695244746\nMODEL: 961981996398428\nUSER: 2132463855293933\nMODEL: 5965897461137514\nUSER: 926999987766368\nMODEL: 442597274669536\nUSER: 815198772437867\nMODEL: 71233591097281\nUSER: 3153265826758165\nMODEL: 7384456628567550\nUSER: 6677311435351158\nMODEL: 734859195573746\nUSER: 64583116598439864\nMODEL: 256247833009836\nUSER: 589393213265962\nMODEL: 34739236277631673\nUSER: 434427170537359\nMODEL: 872904795759777\nUSER: 757893887877249\nMODEL: 18523307781765\nUSER: 897775123887549\nMODEL: 887517159447797\nUSER: 63582277633003\nMODEL: 876231948497173\nUSER: 2364421836534466\nMODEL: 255311527852424\nUSER: 56339479702334550\nMODEL: 534151378751267\nUSER: 33034355770796210\nMODEL: 733691172768323\nUSER: 16211564648985\nMODEL: 47695153174872120\nUSER: 112631723916226\nMODEL: 1784638711295587\nUSER: 4129721463371851\nMODEL: 448679222951488\nUSER: 645333782391216\nMODEL: 155467133819039\nUSER: 8281313719284649\nMODEL: 3859701333774090\nUSER: 331125178101999\nMODEL: 92338960080935\nUSER: 651287714116996\nMODEL: 293788175446379\nUSER: 2854464637588240\nMODEL: 5924652888685462\nUSER: 724743617987274\nMODEL: 4916578932369290\nUSER: 49851798134521550\nMODEL: 527831739468322\nUSER: 1224278690343970\nMODEL: 
5296913224356797\nUSER: 4258051597604860\nMODEL: 6469508497476612\nUSER: 754329054551748\nMODEL: 7158694381424794\nUSER: 751721111978474\nMODEL: 243816249188662\nUSER: 89119889554712\nMODEL: 848360204984756\nUSER: 4857527978860032\nMODEL: 6652830021988721\nUSER: 155323414839053\nMODEL: 839598492654557\nUSER: 5415381506664400\nMODEL: 643866384927243\nUSER: 749562904644214\nMODEL: 232050713935745\nUSER: 566816127332784\nMODEL: 4782566244729842\nUSER: 777657102314182\nMODEL: 45393179748428734\nUSER: 8171347836675464\nMODEL: 33282797279561\nUSER: 446588374499466\nMODEL: 945885207228935\nUSER: 4324566637175341\nMODEL: 782475600189972\nUSER: 999678232734222\nMODEL: 559986956532762\nUSER: 3578955240656490\nMODEL: 7367661787229494\nUSER: 6182794971652132\nMODEL: 5497383639689334\nUSER: 184188481168328\nMODEL: 684898728028691\nUSER: 710066863496316\nMODEL: 178561892522103\nUSER: 721286538628646\nMODEL: 6153492475927843\nUSER: 748671916689253\nMODEL: 57737673390369\nUSER: 2166581726046430\nMODEL: 914341311488645\nUSER: 767175911550435\nMODEL: 4862471794581228\nUSER: 4816153619387333\nMODEL: 6233658331291324\nUSER: 181463757661364\nMODEL: 31696266683155243\nUSER: 164894688181397\nMODEL: 313737856393608\nUSER: 26928122666465\nMODEL: 89985583667815\nUSER: 5957731164995686\nMODEL: 5188397246239\nUSER: 57542406129455\nMODEL: 586442737560384\nUSER: 133382969555969\nMODEL: 39754834854476331\nUSER: 768558798693312\nMODEL: 4356168414926503\nUSER: 4944886520099324\nMODEL: 5545782699010750\nUSER: 496868161177635\nMODEL: 21253571473618\nUSER: 852538164809696\nMODEL: 5768991864419517\nUSER: 71337773819459\nMODEL: 4841001018254991\nUSER: 766677272745725\nMODEL: 868253784796224\nUSER: 6919415487587923\nMODEL: 3835170330395410\nUSER: 4217188443239073\nMODEL: 7525358295705790\nUSER: 5751539646501820\nMODEL: 698562246650857\nUSER: 3381148171068\nMODEL: 939232606281694\nUSER: 4135792889197922\nMODEL: 96245578293568\nUSER: 26285978096562\nMODEL: 919733026233356\nUSER: 55287747514393\nMODEL: 691967756254688\nUSER: 4471579437738593\nMODEL: 762385496636856\nUSER: 923589972644988\nMODEL: 65959528338256\nUSER: 5477036817124270\nMODEL: 9475999990727340\nUSER: 696516972887026\nMODEL: 519449351602881\nUSER: 153697233971599\nMODEL: 115114517353716\nUSER: 754624618161429\nMODEL: 8329643672030112\nUSER: 749334762278195\nMODEL: 994992150647051\nUSER: 225233472858199\nMODEL: 543055627760971\nUSER: 67568833806256\nMODEL: 647739288869667\nUSER: 835826771426138\nMODEL: 94562158357263\nUSER: 695671316439557\nMODEL: 994539185333055\nUSER: 61313977567118212\nMODEL: ", "stop_sequence": ["\n"], "genkey": "KCPP2261", "max_context_length": 4096, "max_length": 16, "sampler_order": [6, 0, 1, 3, 4, 2, 5], "rep_pen_range": 1024, "rep_pen_slope": 0.7, "n": 1, "temperature": 0.85, "min_p": 0.25, "rep_pen": 1.1, "top_p": 1, "top_k": 0, "top_a": 0, "typical": 1, "tfs": 1, "quiet": true, "use_default_badwordsids": false} Processing Prompt [BLAS] (… / 4042 tokens) ```