CasualAutopsy opened 4 days ago
Update: I've noticed this pops up every time I use a grammar for the first time after each boot of Ooba:

```
Warning: unrecognized tokenizer: using default token formatting
```

I've also got a traceback this time instead of an endless generation:
```
Traceback (most recent call last):
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\modules\callbacks.py", line 61, in gentask
    ret = self.mfunc(callback=_callback, *args, **self.kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\modules\text_generation.py", line 398, in generate_with_callback
    shared.model.generate(**kwargs)
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\installer_files\env\Lib\site-packages\transformers\generation\utils.py", line 2048, in generate
    result = self._sample(
             ^^^^^^^^^^^^^
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\installer_files\env\Lib\site-packages\transformers\generation\utils.py", line 3018, in _sample
    next_token_scores = logits_processor(input_ids, next_token_logits)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\installer_files\env\Lib\site-packages\transformers\generation\logits_process.py", line 104, in __call__
    scores = processor(input_ids, scores)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\installer_files\env\Lib\site-packages\transformers\generation\logits_process.py", line 843, in __call__
    entropy = torch.distributions.Categorical(logits=scores).entropy()
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\installer_files\env\Lib\site-packages\torch\distributions\categorical.py", line 71, in __init__
    super().__init__(batch_shape, validate_args=validate_args)
  File "C:\Users\CasAu\OneDrive\Desktop\AI_Playground\SillyTavern-Launcher\text-completion\text-generation-webui\installer_files\env\Lib\site-packages\torch\distributions\distribution.py", line 70, in __init__
    raise ValueError(
ValueError: Expected parameter logits (Tensor of shape (1, 128256)) of distribution Categorical(logits: torch.Size([1, 128256])) to satisfy the constraint IndependentConstraint(Real(), 1), but found invalid values:
tensor([[nan, nan, nan,  ..., nan, nan, nan]], device='cuda:0')
```
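For context, here is a dependency-free sketch of one way NaN values can reach the entropy computation in the traceback. This is my assumption about the mechanism, not something confirmed in this report: if a grammar-constrained logits processor masks every candidate token to `-inf`, the softmax normalizer becomes zero, the resulting probabilities are 0/0 = NaN, and any later step that validates the scores (like `Categorical`) rejects them.

```python
import math

def softmax(logits):
    # Exp-normalize the logits. If every logit is -inf, each exp() is 0.0,
    # the normalizer is 0, and the true result of 0/0 is NaN.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total if total > 0 else float("nan") for e in exps]

# Hypothetical grammar mask that (incorrectly) bans every token:
masked = [float("-inf")] * 4
probs = softmax(masked)
print(probs)  # all NaN, mirroring the invalid-values tensor above
```

If this is what happens here, the space literal in the grammar would leave no token matching the constraint, so the mask zeroes out the whole vocabulary before sampling.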
Describe the bug

When a space is defined within a rule via `" "`, it does absolutely nothing.

Is there an existing issue for this?
Reproduction
Just simply add a space to a rule.
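For illustration, a minimal GBNF grammar (hypothetical, not the grammar from this report) where a rule contains a literal-space terminal:

```gbnf
# root emits a greeting, a literal space, then a name
root ::= "Hello" " " name
name ::= [A-Za-z]+
```

Per the report, the `" "` terminal is silently ignored during generation instead of forcing a space into the output.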
Models I've tried it on:

- Stheno 3.2
- Lunaris v1
GBNF I've used:
Screenshot
Logs
System Info