aiplan4eu / unified-planning

The AIPlan4EU Unified Planning Library
Apache License 2.0

Several issues when doing `copy.deepcopy` on a parsed PDDL problem #613

Open scastro-bdai opened 2 months ago

scastro-bdai commented 2 months ago

Describe the bug I have a use case in which I want to parse a PDDL domain file without a problem file, and then add the problem information (objects, initial conditions, and goals) programmatically at runtime.

While this works, I was relying on doing the parsing only once: every time I plan, I deep copy the incomplete problem and make my modifications before planning. In doing so, I discovered a whole range of issues that I wanted to share with the authors here.

To Reproduce I have attached the complete files, but here is the general gist. We first parse the domain only, with no problem file.

from unified_planning.io import PDDLReader

reader = PDDLReader()
orig_problem = reader.parse_problem("test_domain.pddl")

Now we'll make a deepcopy

import copy
problem = copy.deepcopy(orig_problem)

ISSUE 1: When you create a new object using a user type from the copied problem, the environment doesn't cleanly copy over.

robby = Object("robby", problem.user_type("robot"))

#   File "/usr/local/lib/python3.10/dist-packages/unified_planning/model/object.py", line 46, in __init__
#    assert self._env.type_manager.has_type(
#    AssertionError: type of the object does not belong to the same environment of the object    

This can be worked around by using the third argument:

robby = Object("robby", problem.user_type("robot"), problem.environment)

ISSUE 2: When adding a negated goal, you get an error. For example:

problem.add_goal(Not(robot_at(robby, kitchen)))

#   File "/usr/local/lib/python3.10/dist-packages/unified_planning/model/expression.py", line 165, in auto_promote
#    e.environment == self.environment
#    AssertionError: Expression has a different environment of the expression manager

I found I could try to get around this with ugly internal syntax... but that led me to another error:

negated_goal = problem.environment._expression_manager.Not(robot_at(robby, kitchen))

# *** unified_planning.exceptions.UPTypeError: The expression '(not robot_at(robby, kitchen))' is not well-formed

ISSUE 3: This is directly related to the issue above. Even if you add no goals at all, if the parsed domain already contains an action with a similarly negated precondition or effect, the same error comes up when running the planner.

  File "/usr/local/lib/python3.10/dist-packages/unified_planning/model/walkers/type_checker.py", line 45, in get_type
    raise UPTypeError(
unified_planning.exceptions.UPTypeError: The expression '(not robot_at(r, loc1))' is not well-formed

Expected behavior Problem modification and planning should still work if the Problem object is deep copied.


scastro-bdai commented 2 months ago

Oh, and here are the attached example files for you to reproduce if you'd like.

up_repro.zip

alvalentini commented 1 month ago

Hello @scastro-bdai! Thanks for opening the issue and for using the unified-planning library.

At the moment deepcopy is not correctly supported, but the Problem class has a clone method that should work fine for your use case. Have you tried it?

scastro-bdai commented 1 month ago

Thank you @alvalentini -- I did not notice that method. Just tested and indeed it seems to work for me perfectly.

Not sure if you'd still like to keep this issue open or close it, as there is a viable workaround... maybe a good solution would be documenting this workflow somewhere, through an example that loads a PDDL domain and plans multiple times over different problem specifications?

alvalentini commented 1 month ago

It is a good idea, thanks!

alvalentini commented 1 month ago

Actually, here we have a similar example.

scastro-bdai commented 1 month ago

> Actually, here we have a similar example.

Yep, that's exactly what I am doing! If I had read that example I would have been all set.

Maybe it's just good that we have this issue so people who similarly try to deepcopy at least get something when they search.