agungprasetyosakti / ahastar

Automatically exported from code.google.com/p/ahastar

Optimal-sized clusters #23

Closed by GoogleCodeExporter 9 years ago

GoogleCodeExporter commented 9 years ago
The performance of OHA* depends on the effectiveness of the map
decomposition. If large empty rectangles can be identified, we expand far
fewer nodes and obtain bigger savings (vs. A*) than if only small rectangles
are found.

In that spirit, we need an algorithm for decomposing a grid map that
yields optimally sized rectangles. One obvious method is a backtracking
search whose root has every traversable tile on the map as a successor.
Each node is the origin of some empty rectangle, and we select for expansion
the node that allows the largest rectangle to be built.
This is a systematic search that guarantees an optimal solution is found.
Here, the optimal solution is the one in which every traversable tile is
assigned to a rectangle and the total number of rectangles is minimised.
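To make the expansion ordering concrete, here is a minimal C++ sketch. It is not the systematic backtracking search itself and is not code from this project; it only illustrates the greedy ordering idea: repeatedly pick the unassigned traversable tile that anchors the largest empty rectangle, carve that rectangle out, and repeat until every traversable tile is covered. All names (Rect, largestRectAt, greedyDecompose) are hypothetical.

```cpp
#include <vector>

// Grid representation assumed for this sketch: free_[y][x] == true means the
// tile is traversable and not yet assigned to a rectangle.
struct Rect { int x, y, w, h; };

// Largest empty rectangle whose top-left corner is (x, y), growing right and
// down over tiles that are still free.
static Rect largestRectAt(const std::vector<std::vector<bool>>& free_,
                          int x, int y)
{
    int H = free_.size(), W = free_[0].size();
    Rect best{x, y, 0, 0};
    int maxW = W - x;                       // widest the rectangle can get
    for (int h = 1; y + h <= H; ++h) {
        // shrink the allowed width to the clear span of the newest row
        int w = 0;
        while (w < maxW && free_[y + h - 1][x + w]) ++w;
        maxW = w;
        if (maxW == 0) break;
        if (maxW * h > best.w * best.h) best = {x, y, maxW, h};
    }
    return best;
}

// Greedy decomposition: always commit to the single largest rectangle that
// can still be built, then mark its tiles as assigned.
std::vector<Rect> greedyDecompose(std::vector<std::vector<bool>> free_)
{
    std::vector<Rect> rects;
    int H = free_.size(), W = free_[0].size();
    while (true) {
        Rect best{0, 0, 0, 0};
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x)
                if (free_[y][x]) {
                    Rect r = largestRectAt(free_, x, y);
                    if (r.w * r.h > best.w * best.h) best = r;
                }
        if (best.w * best.h == 0) break;    // every traversable tile assigned
        for (int y = best.y; y < best.y + best.h; ++y)
            for (int x = best.x; x < best.x + best.w; ++x)
                free_[y][x] = false;        // mark tiles as assigned
        rects.push_back(best);
    }
    return rects;
}
```

In the full systematic search described above, each such choice would be one branch of the backtracking tree rather than an irrevocable commitment, so the search could revisit a decision when a smaller first rectangle leads to fewer rectangles overall.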

We might be able to speed things up with a branch-and-bound approach in
which a lower bound is computed for each rectangle.
One such bound comes from relaxing the problem: take the maximum size of
the rectangle that can be built if we ignore horizontal obstacles and then
vertical obstacles.
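One possible reading of that relaxation, sketched below as an assumption rather than the issue author's actual formulation: a rectangle anchored at (x, y) can be no wider than the clear run to the right along row y and no taller than the clear run downward along column x, so the product of the two runs is an optimistic bound on its area. The function name relaxedAreaBound and the grid layout are hypothetical.

```cpp
#include <vector>

// Relaxed bound on the area of any empty rectangle anchored at (x, y):
// obstacles outside row y and column x are ignored, so the bound can only
// overestimate the true maximum. In a branch-and-bound search, a node whose
// bound cannot beat the best rectangle found so far can be pruned.
int relaxedAreaBound(const std::vector<std::vector<bool>>& free_, int x, int y)
{
    if (!free_[y][x]) return 0;
    int H = free_.size(), W = free_[0].size();

    int runRight = 0;                       // clear tiles along row y
    while (x + runRight < W && free_[y][x + runRight]) ++runRight;

    int runDown = 0;                        // clear tiles along column x
    while (y + runDown < H && free_[y + runDown][x]) ++runDown;

    return runRight * runDown;              // >= area of any rectangle at (x, y)
}
```

Because the bound is cheap (two linear scans per tile), it could be evaluated at every node of the search before committing to the more expensive exact rectangle computation.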

Original issue reported on code.google.com by dhara...@gmail.com on 4 Apr 2010 at 8:19

GoogleCodeExporter commented 9 years ago
Some work was made toward this goal in November/December 2010 but it looks like 
a dead end. Preprocessed Jump Points have a similar aim to this and they work 
well. It looks unlikely convex rooms will yield better results making this 
issue irrelevant.

Original comment by dhara...@gmail.com on 7 Apr 2011 at 12:06