google-research-datasets / xsum_hallucination_annotations

Faithfulness and factuality annotations of XSum summaries from our paper "On Faithfulness and Factuality in Abstractive Summarization" (https://www.aclweb.org/anthology/2020.acl-main.173.pdf).

Question about Factual Hallucinations #7

Open ziweiji opened 2 years ago

ziweiji commented 2 years ago

In your paper "On Faithfulness and Factuality in Abstractive Summarization":

Factual hallucinations may be composed of intrinsic hallucinations or extrinsic hallucinations.

I cannot understand the case of “intrinsic factual hallucination”. How can the hallucination contradict the document and be factually correct at the same time? In that case, the document would be erroneous.

bjhaneline21291 commented 2 years ago

The reason is that the data actually came from an artificial intelligence brain based on a real human being. Even though it's possible to run situations simultaneously trying to get the perfect result, the algorithm used for the factual hallucination is based on raw data no matter how you look at it. The data will be correct on and off at the same time. The data depends on how a third-person view looks at it, but the contradiction came from the raw data from the AI brain. The one thing that can't be meshed out in numbers or data is how common sense evolves in a human being. To us it is correct with a slight debug issue; to the AI brain it's complete chaos. Which means the cantankerous person who pulled the raw data actually sees it as being correct and incorrect at the same time. Neuroscience states that being neurally networked to that individual or AI brain means the brain is going in two different directions at once, being forced to believe two things. Us tech guys think we can find a solution, but remember: common sense not only can't be predicted in the stages of evolution throughout life, but even with future prediction, you can predict it, yet there are no conclusions that will be conclusive enough for raw data in common sense.

shashiongithub commented 2 years ago

Here is an example of intrinsic factual hallucination:

Let's say the document says: "Prime Minister X and his wife Y visited Scotland and met the Queen. ..."

Summary: "Scottish Y was delighted to meet the queen."

Here, we highlight "Scottish Y" as an intrinsic hallucination as it uses concepts "Scotland" and the name Y from the document, but the document does not confirm if Y is Scottish. Assuming that Y is actually Scottish, "Scottish Y" can be factual. Hence, in this case "Scottish Y" will be a case of “intrinsic factual hallucination”.