AI-Planning / p4pp

Planning tools for planning pedagogy
MIT License

Fixed bug with double-negated preconditions. We now simplify the theory #6

Open nirlipo opened 2 years ago

nirlipo commented 2 years ago

This fixes the following bug:

*(image attachment)*

into:

*(image attachment)*

Please check that this is the correct transformation. Note that the original was negating the AND formula, so the precondition became an OR of some double-negated fluents, which were then handled incorrectly by the planner.
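For context, the simplification being discussed can be sketched in a few lines. This is not the repo's actual code (the formula representation and function name here are mine): it just shows pushing a negation through a conjunction with De Morgan's law while eliminating the double negations that would otherwise be left behind.

```python
# Minimal stand-alone sketch of the simplification under discussion.
# Formulas are plain tuples: ("not", f), ("and", f1, ...), ("or", f1, ...),
# and atoms are strings.

def negate(f):
    """Negate f, simplifying as the negation is pushed inward."""
    if isinstance(f, tuple) and f[0] == "not":
        return f[1]                                        # ¬¬p → p
    if isinstance(f, tuple) and f[0] == "and":
        return ("or",) + tuple(negate(g) for g in f[1:])   # De Morgan
    if isinstance(f, tuple) and f[0] == "or":
        return ("and",) + tuple(negate(g) for g in f[1:])  # De Morgan
    return ("not", f)

# Negating (p ∧ ¬q) yields (¬p ∨ q) directly, with no stray ¬¬q inside.
print(negate(("and", "p", ("not", "q"))))   # → ('or', ('not', 'p'), 'q')
```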

haz commented 2 years ago

Can you clarify -- was the original incorrect, or just not handled by the planner you were using?

haz commented 2 years ago

Read through the change -- not sure this works semantically. We want to negate the other precondition, and if it's a conjunction, that means it indeed turns into a disjunction. Without doing that, you're essentially saying it's a failed action only if you can violate all of the domain2 preconditions for the action while satisfying all of the domain1 preconditions. We'd rather want to violate just one of the domain2 preconditions while satisfying the domain1 preconditions.

That make sense?
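The semantics haz describes can be sketched directly (again with my own hypothetical tuple representation and function name, not the repo's code): the failed action's precondition should require all domain1 preconditions to hold while at least one domain2 precondition is violated.

```python
# pre(fail) = (AND of domain1 preconds) AND (OR of negated domain2 preconds):
# the action fails as soon as a SINGLE domain2 precondition is violated,
# not only when all of them can be violated at once.

def failed_action_precondition(domain1_pre, domain2_pre):
    violate_one = ("or",) + tuple(("not", p) for p in domain2_pre)
    return ("and",) + tuple(domain1_pre) + (violate_one,)

pre = failed_action_precondition(["p", "q"], ["r", "s"])
print(pre)   # → ('and', 'p', 'q', ('or', ('not', 'r'), ('not', 's')))
```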

nirlipo commented 2 years ago

I tried to use the `lor` formula, but then Tarski would print it into an action embedding an `or` connective between each term, which defeats the purpose.

In the latest commit, I ended up adding a failed action for each precondition, by negating only one precondition at a time. This indicates exactly which precondition is triggering the error.

*(image attachment)*

It may be that the original encoding was correct, but I couldn't understand which precondition was failing. Now I do :)
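The split approach described above can be sketched as follows (the function and naming scheme are invented here for illustration, not taken from the commit): one failed action per domain2 precondition, each negating only that one precondition, so the action's name identifies exactly which precondition was violated.

```python
# One "failed" action per domain2 precondition. Preconditions are atoms
# (strings) or ("not", atom) tuples; each generated action keeps all
# domain1 preconditions and negates a single domain2 precondition.

def split_failed_actions(name, domain1_pre, domain2_pre):
    actions = []
    for i, p in enumerate(domain2_pre):
        pre = tuple(domain1_pre) + (("not", p),)   # negate just this one
        actions.append((f"{name}_failed_{i}", pre))
    return actions

for act_name, pre in split_failed_actions("move", ["p"], ["q", "r"]):
    print(act_name, pre)
# move_failed_0 fires when q is violated, move_failed_1 when r is.
```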

nirlipo commented 2 years ago

On a different note, many alignments fail because student submissions are parsed fine by the online solver but not by Tarski. We need to find a workaround, e.g., exposing this as an autograder so they can make sure their model works well with Tarski too.

haz commented 2 years ago

> In the latest commit, I ended up adding a failed action for each precondition, by negating only one precondition at a time. This indicates exactly which precondition is triggering the error.

Very compelling. But I'm a bit torn. To truly know what's wrong, that's the best approach. To just give students a hint (e.g., in running a validation on a webserver where they don't get to see the reference domain being used), then perhaps it's too much info. Ultimately, having the choice might get us the best of both worlds. Two requests if you're up for it:

  1. Make it a cmd-line option to go with the original mode (negate it all) -vs- the split mode (negate each precond and use multiple actions).
  2. Put in the assertions required to ensure the approach is operating only on domains that make sense. If I have arbitrary nesting, or an `or` in the preconditions, or whatever, then it will fail silently (I think?). Using the original planner handles these cases, so the check only needs to be there for the "split" option.
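A minimal sketch of the two requests, assuming the same tuple-based formula representation as above (the `--fail-mode` flag name and helper functions are hypothetical, not existing options of this repo): an argparse switch between the two modes, plus a loud assertion that the split mode only operates on flat conjunctions of literals.

```python
import argparse

def is_literal(f):
    """Atoms, or single negations of atoms; nothing nested deeper."""
    return isinstance(f, str) or (
        isinstance(f, tuple) and f[0] == "not" and isinstance(f[1], str))

def check_split_supported(preconditions):
    """Split mode only makes sense for a flat conjunction of literals;
    fail loudly instead of silently on nested formulas or disjunctions."""
    for p in preconditions:
        assert is_literal(p), f"split mode cannot handle nested formula: {p!r}"

parser = argparse.ArgumentParser()
parser.add_argument("--fail-mode", choices=["negate-all", "split"],
                    default="negate-all",
                    help="negate the whole precondition at once, or emit "
                         "one failed action per negated precondition")
args = parser.parse_args(["--fail-mode", "split"])
if args.fail_mode == "split":
    check_split_supported(["p", ("not", "q")])   # ok: flat literals only
print(args.fail_mode)   # → split
```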

> On a different note, many alignments fail because student submissions are parsed fine by the online solver but not by Tarski. We need to find a workaround, e.g., exposing this as an autograder so they can make sure their model works well with Tarski too.

Ya, this is a major problem. I've kind of conceded to not solving it, since I need to move to a different style of assessment for the course.

nirlipo commented 2 years ago

Hosting seems like a good move to allow testing against Tarski and continuous feedback while developing their models.

haz commented 2 years ago

Big-time agree. Think you can tackle 1+2 above? I'll get it server-wrapped for us from there (will be needed early 2023).

haz commented 1 year ago

@nirlipo Any chance you'd like to make a move on the above? In the meantime, I've started picking away at the server setup...