vgvassilev / clad

clad -- automatic differentiation for C/C++
GNU Lesser General Public License v3.0

Regression in `clad` master in dealing with templated functions #922

Closed: guitargeek closed this issue 3 months ago

guitargeek commented 3 months ago

This was noticed when running the RooFit unit tests with clad master.

Reproducer (a ROOT macro, but it should be easy to turn into a compiled executable):

// Standard includes (implicit when run as a ROOT macro, needed for a compiled build).
#include <cmath>
#include <iostream>
#include <vector>

// Horner-scheme polynomial evaluation; the pdfMode offset only applies when lowestOrder > 0.
template <bool pdfMode>
inline double polynomial(double const *coeffs, int nCoeffs, int lowestOrder, double x)
{
   double retVal = coeffs[nCoeffs - 1];
   for (int i = nCoeffs - 2; i >= 0; i--)
      retVal = coeffs[i] + x * retVal;
   retVal = retVal * std::pow(x, lowestOrder);
   return retVal + (pdfMode && lowestOrder > 0 ? 1.0 : 0.0);
}

double roo_func_wrapper_4(double *params)
{
   double t4[] = {params[0], params[1], 1.};
   const double t5 = polynomial<false>(t4, 3, 0, 1.);
   return t5;
}
#include <Math/CladDerivator.h>

#pragma clad ON
void roo_func_wrapper_4_req()
{
   clad::gradient(roo_func_wrapper_4, "params");
}
#pragma clad OFF

void reproducer()
{
   std::vector<double> parametersVec = {-0.5, -0.5, 0.5};

   std::vector<double> gradientVec(parametersVec.size());

   auto wrapper = [&](double *params) { return roo_func_wrapper_4(params); };

   std::cout << roo_func_wrapper_4(parametersVec.data()) << std::endl;
   roo_func_wrapper_4_grad(parametersVec.data(), gradientVec.data());

   std::cout << "Clad diff:" << std::endl;
   std::cout << gradientVec[0] << std::endl;
   std::cout << gradientVec[1] << std::endl;
   std::cout << gradientVec[2] << std::endl;

   auto numDiff = [&](int i) {
      const double eps = 1e-6;
      std::vector<double> p{parametersVec};
      p[i] = parametersVec[i] - eps;
      double nllValDown = wrapper(p.data());
      p[i] = parametersVec[i] + eps;
      double nllValUp = wrapper(p.data());
      return (nllValUp - nllValDown) / (2 * eps);
   };

   std::cout << "Num diff:" << std::endl;
   std::cout << numDiff(0) << std::endl;
   std::cout << numDiff(1) << std::endl;
   std::cout << numDiff(2) << std::endl;
}

Output:

0
Clad diff:
0
0
0
Num diff:
1
1
0
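
With x = 1., lowestOrder = 0 and pdfMode = false, the Horner loop makes roo_func_wrapper_4(params) equal to params[0] + params[1] + 1., so the analytic gradient is (1, 1, 0). The numerical differentiation above agrees with that, while the clad-generated gradient is all zeros. A clad-free sketch of that hand reduction (the helper name is made up for illustration):

// Hand reduction of roo_func_wrapper_4 for coeffs = {params[0], params[1], 1.},
// nCoeffs = 3, lowestOrder = 0, x = 1., pdfMode = false:
//   retVal = 1.                          (coeffs[2])
//   retVal = params[1] + 1. * retVal     (i = 1)
//   retVal = params[0] + 1. * retVal     (i = 0)
//   retVal *= std::pow(1., 0), which is 1., and the pdfMode term is 0.
double reduced_wrapper(double const *params)
{
   return params[0] + params[1] + 1.;
}
// Hence d/dparams[0] = 1, d/dparams[1] = 1, d/dparams[2] = 0 for any params.
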
vgvassilev commented 3 months ago

@guitargeek, do you have an idea which commit broke it? There are 17 or so commits since the tag...

vaithak commented 3 months ago

@PetroZarytskyi I just checked, and this was introduced in https://github.com/vgvassilev/clad/pull/904. I have created a very minimal reproducer below. Could you please look into it?

#include "clad/Differentiator/Differentiator.h"

double f(double x) {
  return x + (x > 0 ? 1.0 : 0.0);
}

int main() {
  auto f_dx = clad::gradient(f);
  double dx = 0;
  // d/dx [x + (x > 0 ? 1.0 : 0.0)] is 1 for x != 0, so dx should come out as 1.
  f_dx.execute(3, &dx);
  std::cout << dx << std::endl;
  return 0;
}
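
At x = 3 the ternary takes the constant 1.0 branch, so f behaves locally like x + 1.0. A quick clad-free cross-check by central differences (a sketch, just to confirm the expected derivative of 1):

#include <iostream>

double f(double x) {
  return x + (x > 0 ? 1.0 : 0.0);
}

int main() {
  // Central difference away from the jump at x == 0; should print 1 (up to rounding).
  const double eps = 1e-6;
  std::cout << (f(3. + eps) - f(3. - eps)) / (2 * eps) << std::endl;
  return 0;
}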