-
Looking at the data for `4 bits per tensor` in the results table for the OPT model optimization, the values for `c4 Perplexity` and `wikitext2 Perplexity` seem unusually high compared to the other configura…
-
### Version
0.0.5.9
### Describe the bug
As in the title: about once every dozen prompts, it gets completely stuck on 'Researching'. For every prompt, I refresh Perplexity and start from the home page.
### Steps…
-
Hey Riri,
Is there any chance of adding the Perplexity AI search engine to this addon?
-
@kongzii says "give it both Predict and Tavily tools"
@evangriffiths says "give one agent just Predict and one just Tavily, and deploy both, and compare"
@gabrielfior says "don't care, just get o…
-
Hello, when using the extension with Perplexity, it clears the original question you asked and then pastes a completely irrelevant memory.
I am fine with the irrelevant memory, but the question I ask …
-
Hi, can you please include the perplexity evaluation from llama.cpp in koboldcpp? There is a separate script for that in llama.cpp, called perplexity. Currently it looks like this script is not present comple…
-
Hi team,
I was fine-tuning an LLM with Ludwig on an **NVIDIA A100** instance.
I get the error message **Encountered `nan` values in tensor. Will be removed. (UserWarning)**. My loss and perplexi…
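For reference, a minimal PyTorch sketch of the behaviour that kind of warning describes: NaN entries being dropped from a tensor before a metric such as loss or perplexity is aggregated. This is only an illustration, not Ludwig's internal code.

```python
import torch

# Illustrative only: drop NaN entries from a metric tensor before aggregating,
# which is what a "nan values ... will be removed" warning typically signals.
values = torch.tensor([2.1, float("nan"), 1.8])
nan_mask = torch.isnan(values)
if nan_mask.any():
    values = values[~nan_mask]  # the NaN entry is removed
print(values.mean())  # tensor(1.9500)
```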
-
Now in the harness there are metrics called "byte_perplexity" and "word_perplexity". These two metrics normalize the perplexity by the text length in characters and in words, respectively. If we want to norm…
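As a rough sketch of that normalization idea (assuming the summed negative log-likelihood is in nats, that the byte variant counts UTF-8 bytes, and that the word variant counts whitespace-separated words; the function below is illustrative, not the harness's actual API):

```python
import math

def length_normalized_perplexities(total_neg_log_likelihood: float, text: str):
    """Normalize a summed negative log-likelihood (in nats) by two different
    length measures, mirroring the idea behind byte_perplexity and
    word_perplexity. Name and signature are illustrative only."""
    n_bytes = len(text.encode("utf-8"))  # byte count (>= character count for non-ASCII text)
    n_words = len(text.split())          # whitespace-separated word count
    byte_ppl = math.exp(total_neg_log_likelihood / n_bytes)
    word_ppl = math.exp(total_neg_log_likelihood / n_words)
    return byte_ppl, word_ppl
```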
-
```
I'm trying to evaluate a 5-gram model on a Vietnamese corpus, but the perplexity
doesn't seem to be right...
What steps will reproduce the problem?
1. Download and extract problem.zip
2. Follow th…
```
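A quick sanity check for cases like this (a general formula, not specific to whichever toolkit this issue concerns): n-gram tools usually report a base-10 total log-probability, and the perplexity follows from that and the number of scored tokens, which may include end-of-sentence markers and exclude OOV words depending on the settings. A hedged sketch:

```python
import math

# Hedged sketch (not tied to a specific toolkit): perplexity from a reported
# base-10 total log-probability and the number of scored tokens. A mismatched
# token count (end-of-sentence markers, OOV handling) is a common source of
# "wrong-looking" perplexities.
def ngram_perplexity(total_log10_prob: float, num_scored_tokens: int) -> float:
    return 10 ** (-total_log10_prob / num_scored_tokens)

# Example with made-up numbers: a total log10 probability of -12345.6 over
# 5000 scored tokens.
print(ngram_perplexity(-12345.6, 5000))  # ≈ 294.5
```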