Welcome to the DLPlan-library.
Figure 1: Illustration of the core functionality of DLPlan: evaluating a domain-general state feature, based on description logics with planning extensions, on a given first-order planning state.
We consider a set of predicates where each predicate has the form p\n, where p is a name and n is the number of arguments, drawn from a set of objects. A unary predicate takes exactly one argument and a binary predicate takes exactly two arguments. An atom is a predicate p whose arguments are instantiated with objects. A state is a set of atoms.
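To make these definitions concrete, here is a minimal sketch in plain Python (independent of DLPlan's actual API; all names are illustrative) that models predicates, atoms, and states as defined above:

```python
# A predicate p\n is a name p together with an arity n.
# An atom grounds a predicate with objects; a state is a set of atoms.

objects = {"a", "b", "c"}

# Predicates of a toy Blocksworld-like language: on\2 and clear\1.
predicates = {("on", 2), ("clear", 1)}

def make_atom(name, *args):
    """Ground a predicate: the arity must match and all arguments must be objects."""
    assert any(p == name and n == len(args) for p, n in predicates)
    assert all(obj in objects for obj in args)
    return (name, args)

# A state is simply a set of atoms.
state = {
    make_atom("on", "a", "b"),
    make_atom("clear", "a"),
    make_atom("clear", "c"),
}
print(len(state))  # 3 atoms
```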
We consider a set of classical planning instances Q = {P_1, P_2, ..., P_n} where each P in Q consists of a set of states over a common state language.
There are two types of objects in description logics: concepts and roles. A concept C is an additional unary predicate C\1, and a role R is an additional binary predicate R\2. Description logics provides several base grammar rules and inductive grammar rules. Their interpretation on a state yields a set of atoms over the unary (resp. binary) predicates. Counting the number of ground atoms yields the valuation of a numerical feature n : S -> {0,1,...}, and checking whether there exists at least one ground atom yields a Boolean feature b : S -> {0,1}. Since we assume a common state language for all planning instances in Q, we can evaluate the features on any state of any planning instance P in Q.
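The feature semantics above can be sketched in plain Python (again independent of DLPlan's actual API): a concept denotes a set of objects per state, a numerical feature counts those objects, and a Boolean feature checks non-emptiness. All names are illustrative:

```python
# A state is a set of atoms (name, args), as defined earlier.
state = {("on", ("a", "b")), ("clear", ("a",)), ("clear", ("c",))}

def clear_concept(state):
    """Interpretation of the primitive concept clear\1:
    the set of objects appearing as the argument of a clear atom."""
    return {args[0] for (name, args) in state if name == "clear"}

def n_clear(state):
    """Numerical feature n : S -> {0,1,...}: count the objects in the concept."""
    return len(clear_concept(state))

def b_clear(state):
    """Boolean feature b : S -> {0,1}: check whether the concept is non-empty."""
    return int(bool(clear_concept(state)))

print(n_clear(state), b_clear(state))  # 2 1
```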
The library consists of five components. Each component has its own public header file, examples, tests, and python bindings.
The core component provides functionality for the construction and evaluation of domain-general state features based on description logics.
The generator component provides functionality for automatically generating a set of domain-general state features that are distinguishable on a given finite set of states.
The policy component implements the general policy language.
The state space component provides functionality for generating state spaces from PDDL.
The novelty component provides functionality for width-based planning and learning.
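The pruning idea behind the generator component can be sketched as follows: two features are indistinguishable on a given finite set of states if they assign identical valuations to every state, so only one representative per valuation profile needs to be kept. A plain-Python sketch of this idea (illustrative only, not the library's implementation):

```python
# States as sets of atoms; candidate numerical features as named functions.
states = [
    {("clear", ("a",)), ("clear", ("b",))},
    {("clear", ("a",))},
    set(),
]

def count_clear(state):
    """Count clear atoms in a state."""
    return sum(1 for (name, _) in state if name == "clear")

def count_atoms(state):
    """Count all atoms in a state."""
    return len(state)

candidates = {"n_clear": count_clear, "n_atoms": count_atoms}

# Keep one representative per valuation profile: features with identical
# valuations on all given states are indistinguishable and are pruned.
seen = {}
for fname, f in candidates.items():
    profile = tuple(f(s) for s in states)
    seen.setdefault(profile, fname)

kept = sorted(seen.values())
# On these states both candidates have profile (2, 1, 0),
# so only one representative survives.
print(kept)  # ['n_clear']
```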
DLPlan depends on a subset of Boost's header-only libraries (Fusion, Spirit X3, Container). Its performance benchmarking framework depends on GoogleBenchmark, its testing framework on GoogleTest, and its Python bindings on pybind11.
We provide a CMake Superbuild project that takes care of downloading, building, and installing all dependencies.
# Configure dependencies
cmake -S dependencies -B dependencies/build -DCMAKE_INSTALL_PREFIX=dependencies/installs
# Build and install dependencies
cmake --build dependencies/build -j16
Create a Python virtual environment and install the dependencies:
python3 -m venv --prompt dlplan .venv
source .venv/bin/activate
pip install state_space_generator
Run the following from the project root to build the library. By default, the library compiles in Debug mode.
# Configure with installation prefixes of all dependencies
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH=${PWD}/dependencies/installs
# Build
cmake --build build -j16
# Install (optional)
cmake --install build --prefix=<path/to/installation-directory>
To use DLPlan in other CMake projects, add the following to the root CMakeLists.txt:
list(APPEND CMAKE_PREFIX_PATH "<path/to/dlplan_install_dir>")
find_package(dlplan 0.1 REQUIRED COMPONENTS core generator policy statespace novelty serialization)
-DBUILD_TESTS:BOOL=TRUE
enables compilation of the tests.

If you are only interested in using the Python interface, use the following command, which takes care of building and installing the Python bindings together with all dependencies:
pip install dlplan
The subdirectory examples/
contains a number of helpful examples that illustrate different use cases of this library.
You can run the C++ examples with
./build/examples/core/core_example
./build/examples/generator/generator_example
The Python bindings also come with the same examples. Run them with
python3 examples/core/core.py
python3 examples/generator/generator.py
You can run the C++ tests with:
cd build/tests && ctest
The Python bindings also come with their own set of tests. Run them with
python3 -m pytest api/python/
In the experiments/
directory, we provide code to profile parts of the library.
We created a DOI for DLPlan on Zenodo. A BibTeX entry can look like this:
@software{drexler-et-al-dlplan2022,
author = {Drexler, Dominik and
Francès, Guillem and
Seipp, Jendrik},
title = {{DLPlan}},
year = 2022,
publisher = {Zenodo},
doi = {10.5281/zenodo.5826139},
url = {https://doi.org/10.5281/zenodo.5826139}
}