ForoughA / CORGI

Conversational Neuro-Symbolic Commonsense Reasoning
https://arxiv.org/abs/2006.10022

Proofs and Theorem Prover #1

Open AdamIshay opened 4 years ago

AdamIshay commented 4 years ago

Hello,

When I run the reasoning.py file, acceptedProof is always empty. Is this expected? How do I see the generated proofs?

Additionally, I don't see the neural theorem prover being trained in reasoning.py. Am I missing something, or is there a separate instruction for running the theorem prover training?

Thank you!

ForoughA commented 4 years ago

Hello!

The accepted proof will be non-empty only when you successfully fill in CORGI's missing knowledge through the user dialog. To help further, can you tell me which statement you tried and what your responses to CORGI's prompts were?

You are right, thanks for bringing this up! It looks like I only released the pre-trained model. I will include the training scripts and ping you when the code is up.

Thanks!

AdamIshay commented 4 years ago

Thanks for your response. Here are four statements that I responded to:

Statement: If I have an early morning meeting then wake me up early because I want to be ontime.
Prompt: How do I know 'I be ontime'?
User answer: If I wake up early
Output: Empty proof. Sorry, I do not know how to do what you asked me :(

Statement: If there are thunderstorms in the forecast within a few hours then remind me to close the windows because I want to keep my home dry.
Prompt: How do I know if "I keep my home dry."?
User answer: If I close the windows
Output: Empty proof. Sorry, I do not know how to do what you asked me :(

Statement: If I schedule an appointment that overlaps with another appointment then notify me immediately because I want to let my colleagues know of the conflict.
Prompt: How do I know if "I let my colleagues know of the conflict."?
User answer: If I email them
Output: Empty proof. Sorry, I do not know how to do what you asked me :(

Statement: If I search for a gas station in the navigation app and there is a cheaper gas station that is not too much further away then ask me immediately whether I want to switch the destination to the new gas station because I want to save money.
Prompt: How do I know if "I save money."?
User answer: If I switch my destination to a cheaper gas station
Prompt: How do I know if "If I switch my destination to a cheaper gas station"?
User answer: If I am asked whether I want to switch to a cheaper gas station
Prompt: How do I know if "If I am asked whether I switch to a cheaper gas station"?
User answer: If there is a cheaper gas station that is not too much further away
Output: Empty proof. Sorry, I do not know how to do what you asked me :(

ForoughA commented 4 years ago

Hello and sorry for the slow response! For the first utterance, can you try this answer: "if i wake up at 8"? This should produce a proof given the current knowledge base. If it does not, let me know and I will help fix that.

In general (assuming there are no installation issues or missing files), whether a proof can be generated depends on whether the user interacting with the reasoning engine provides the knowledge needed to complete the proof. Another factor is how complete the knowledge base is in the first place: the current knowledge base is intentionally sparse and has a lot of missing knowledge, because we wanted to see whether it is feasible to rely on human interaction for commonsense reasoning. Let me know if this description is not clear.
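To make this concrete, here is a minimal sketch of the idea (this is not CORGI's actual code; the names `prove`, `KB`, and `ask_user` and the toy rules are all made up for illustration): backward chaining where a goal with no matching rule triggers a question to the user, and the proof closes only if the built-in plus user-supplied knowledge covers every subgoal.

```python
# Toy illustration only: backward chaining with a user fallback for missing knowledge.
# Knowledge base: goal -> list of alternative bodies (each a list of subgoals).
# A fact is a goal with an empty body. The KB is intentionally sparse.
KB = {
    "wake_up(i, 8am)": [[]],                     # fact: the system can wake me at 8
    "be_on_time(i)": [["wake_up_early(i)"]],     # rule: on time if I wake up early
}

def ask_user(goal):
    """Ask the user how to establish a goal; return a list of subgoals or None."""
    answer = input(f'How do I know "{goal}"? ').strip()
    if not answer:
        return None
    # CORGI parses the natural-language answer; here we just treat the raw
    # string as a single new subgoal.
    return [answer]

def prove(goal, depth=0, max_depth=5):
    """Backward chaining; falls back to the user when a goal has no rule."""
    if depth > max_depth:
        return False
    bodies = KB.get(goal)
    if bodies is None:
        subgoals = ask_user(goal)          # missing knowledge: ask the user
        if subgoals is None:
            return False
        KB[goal] = [subgoals]              # remember the user-supplied rule
        bodies = KB[goal]
    # The goal is proven if any alternative body is fully proven.
    return any(all(prove(g, depth + 1, max_depth) for g in body) for body in bodies)

if __name__ == "__main__":
    # Answering "wake_up(i, 8am)" when asked about "wake_up_early(i)" closes the
    # proof; any answer that cannot be reduced to a known fact leaves it empty.
    print("accepted proof" if prove("be_on_time(i)") else "empty proof")
```

If the user's answer cannot be grounded in something the knowledge base already knows, the search bottoms out and you get exactly the empty-proof behaviour you are seeing.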

jainkhyati commented 4 years ago

Hello,

I got the same results as @AdamIshay. Answering "if i wake up at 8" to the first question does complete the proof, as you suggested. But I was wondering if I was doing something wrong when I got the following responses:


Statement: If there are thunderstorms in the forecast within a few hours then remind me to close the windows because I want to keep my home dry.
Prompt: How do I know if "I keep my home dry."?
User: If the windows are closed
matchedGoal2: close_new(the_windows)
0 : sorry, i do not know how to do what you asked me :(

Statement: If it's going to rain in the afternoon then remind me to bring an umbrella because I want to remain dry.
Prompt: How do I know if "I remain dry."?
User: If I have my umbrella
matchedGoal2: have(i,my_umbrella)
0 : sorry, i do not know how to do what you asked me :(

In both cases above, my answer directly maps the goal to the query and is an example from the paper, and CORGI is able to "match the goal". So the proof should have been completed, but it fails in both cases.

I also frequently noticed that CORGI doesn't ask a second query about the subgoal; it simply fails once it is unable to complete the proof using the first input. What could be the reason for that?

Thank you so much!