-
Issue Description
Problem: When using the mix generate text command with verbose set to false, and the following parameters:
Temperature: 0.1 or 0
Top p: 1
The LLM models seem to hallucinate mor…
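The report above pairs a near-zero temperature with top-p of 1. As a point of reference for what those two knobs actually do, here is a minimal, self-contained sketch of temperature plus top-p (nucleus) sampling over raw logits; it is an illustration of the general technique, not the command's actual implementation.

```python
import math
import random

def sample_token(logits, temperature=1.0, top_p=1.0, rng=None):
    """Sample a token index from logits with temperature and top-p filtering.

    temperature <= 0 collapses to greedy argmax; top_p = 1.0 keeps the
    full distribution (no nucleus filtering).
    """
    rng = rng or random.Random(0)
    if temperature <= 0:
        # Greedy decoding: temperature 0 always takes the most likely token.
        return max(range(len(logits)), key=lambda i: logits[i])

    # Softmax with temperature scaling (max-subtracted for stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Nucleus filtering: keep the smallest set of tokens whose
    # cumulative probability mass reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break

    # Renormalise over the kept tokens and draw one.
    kept_mass = sum(probs[i] for i in kept)
    r = rng.random() * kept_mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]
```

With temperature 0 the draw is deterministic, so any variation in output quality at that setting comes from the model's own top-ranked continuations rather than sampling noise.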
-
Title.
https://blog.streamlit.io/ai21_grounded_multi_doc_q-a/
https://arxiv.org/html/2404.09129v1
https://arxiv.org/pdf/2406.02543
https://huggingface.co/papers/2406.02543
https://huggingface…
-
Hello! This is a great repository, thank you very much @sanchit-gandhi!
We would like to use this repository in our system, but quite a few of our Word-Error Rate (WER) regression tests fail when …
-
### feature
Hello, just wanted to say that the model works great in general, although it seems to have an issue with visual and textual hallucinations. For example, if I ask the model what color is the c…
-
### What is the issue?
Seems like something is wrong with InternLM2.5; I can't get anything meaningful out of it (tried with a 32k context).
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama vers…
-
I followed these steps with a local garak repo:
1. git pull
2. gh pr checkout 851 (a PR involving a new plugin class in an existing plugin module)
3. python -m garak -m test -p packagehallucation.R…
-
It would be cool to show hallucinations during a medication overdose or under severe pain. For example: spawn butterflies, cows, dogs, and even enemy soldiers locally near the player.
It would be realistic and totall…
-
A few users have reported that they were able to trigger hallucinations by asking for a single line of info from a collection.
Minimal:
When Concierge retrieves fewer than 5 references, maybe warn that th…
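The suggested mitigation above can be sketched as a simple guard clause. The function name, the `MIN_REFERENCES` threshold of 5, and the warning wording are all illustrative assumptions, not Concierge's actual API.

```python
MIN_REFERENCES = 5  # hypothetical threshold taken from the report above

def check_reference_count(references):
    """Return a warning string when too few references were retrieved, else None.

    A sketch of the guard suggested above: if retrieval came back with
    fewer than MIN_REFERENCES items, surface a warning so the user knows
    the answer rests on thin context and may be unreliable.
    """
    if len(references) < MIN_REFERENCES:
        return (f"Warning: only {len(references)} reference(s) retrieved; "
                "answers drawn from so little context may be unreliable.")
    return None
```

A caller would show the returned string alongside the answer whenever it is not `None`.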
-
> From 2023/10/11 meeting https://g0v.hackmd.io/t9ypB87SQBuMjjW_PheZVg#Comm-AI-transcript
The current implementation for speech-to-text (based on Whisper API) suffers from hallucination problems. S…
-
First of all: I like your approach. It has potential, it seems to me!
I used a hallucination threshold of 1, but the exact value does not really matter for demonstrating the problem the approach has:
Test with …