ethz-asl / segmap

A map representation based on 3D segments
BSD 3-Clause "New" or "Revised" License
1.06k stars · 393 forks

Minor - Use norm instead of squared norm for features_distance_threshold? #5

Open danieldugas opened 7 years ago

danieldugas commented 7 years ago

see here:

https://github.com/ethz-asl/segmatch/blob/d7f3da4ae40f9b24f6de38c800537ad989a4cb4a/segmatch/src/opencv_random_forest.cpp#L217

rdube commented 7 years ago

.norm() requires one extra square-root operation per comparison, which is why I kept it as .squaredNorm(). We can always specify the squared-norm threshold instead. What do you think?

HannesSommer commented 7 years ago

This sounds like over-optimization to me. When you trade off simplicity, especially in the meaning of a parameter (which a human has to operate with efficiently and safely), against runtime performance, go for simplicity unless it really matters, and in this case I don't see it really mattering. Furthermore, in this case you don't need to decide: as a compromise you can square the parameter into a temporary (before the loop) and then compare that temporary with the squaredNorm. This also aligns much better with another rule: keep optimization local whenever possible. Optimization creates complex and/or surprising code, and that type of code must be small or it is going to be wrong :) and possibly wrong many times over the lifetime of the project.

Another thing concerning these lines: if normalize_eigen_for_hard_threshold is a true parameter (i.e., it is used both ways), then I would change this loop; otherwise, of course, remove it. The problem is that the current design makes its false-case much more inefficient than it should be, mostly because it forces an extra malloc and free per iteration! Dynamic memory allocation is quite often an evil bottleneck, so you don't want it in each of MANY iterations. And I assume it is MANY because of this issue.

So, first: move the Eigen::MatrixXd f1 and f2 out of the for loop (this saves all the mallocs except the first two). Second: if it is worth saving the extra copy in the false-case as well, do the following:

      const double feature_distance_threshold_squared =
          std::pow(params_.feature_distance_threshold, 2);
      if (params_.normalize_eigen_for_hard_threshold) {
        // Declared once, outside the loop: the buffers are allocated on the
        // first assignment and then reused by Eigen's assignment operator.
        Eigen::MatrixXd f1;
        Eigen::MatrixXd f2;
        for (const auto& candidate : candidates_after_first_stage) {
          f1 = candidate.features1_;
          f2 = candidate.features2_;
          normalizeEigenFeatures(&f1);
          normalizeEigenFeatures(&f2);
          if ((f1 - f2).squaredNorm() < feature_distance_threshold_squared) {
            candidates.push_back(candidate);
          }
        }
      } else {
        for (const auto& candidate : candidates_after_first_stage) {
          if ((candidate.features1_ - candidate.features2_).squaredNorm() <
              feature_distance_threshold_squared) {
            candidates.push_back(candidate);
          }
        }
      }

I know it contains some ugly duplication, but it is very local and should be quite rewarding optimization-wise. To make it a bit cleaner and the duplication more local: convert the bodies of the two for loops into an inlineable function.

danieldugas commented 7 years ago

As always Hannes, fantastic in-depth advice! I appreciate it.

@rdube, I'm fine with either solution. I was just pointing out this small discrepancy so that the interface is as consistent with the underlying implementation as possible ;)

danieldugas commented 7 years ago

Also, the following parameters left me confused every time I forgot about them (every two days or so, shame on me), requiring that I peek at the code:

  A: n_nearest_neighbours
  B: enable_two_stage_retrieval
  C: apply_hard_threshold_on_feature_distance

  A | B | C | result
  T | T | T | knn, then hard threshold
  T | T | F | knn, then RF
  T | F | * | knn?
  F | * | * | RF

I feel like a solution would be:

  knn: yes/no
    with hard threshold: yes/no
  RF: yes/no

which leaves the two main algorithms uncoupled. knn without RF works, RF without knn works. knn+RF works.

@rdube, what's your opinion? If you're interested I can work on this when I have a few minutes :)

rdube commented 7 years ago

Thanks for your input! @HannesSommer that looks good, I'll try to change this in the near future.

@exodaniel yes, this is indeed a bit confusing. I would recommend opening a separate issue if we need to discuss further. Your solution looks good; feel free to propose a modification along those lines. We should also check for impossible combinations and exit. Thanks for pointing this out!