eddelbuettel / rcppfastad

Rcpp Bindings to FastAD Automatic Differentiation

Linear Regression Possible Improvements #1

Closed. JamesYang007 closed this issue 1 year ago.

JamesYang007 commented 1 year ago

Thanks again for making this!!

Just had a few suggestions about the linear regression example.

Perhaps replace https://github.com/eddelbuettel/rcppfastad/blob/dbcbcd17073f620de7cb646b5df42a853d6f2de6/src/linear_regression.cpp#L23-L49 with a simpler implementation:

    const auto p = theta_hat.size();
    Eigen::VectorXd theta_adj(p);
    theta_adj.setZero(); // Set adjoints to zeros.

    // Initialize variable.
    ad::VarView<double, ad::vec> theta(theta_hat.data(), theta_adj.data(), p);

    // Create the expression and bind it once, so buffers are allocated up front;
    // only the underlying data needs updating when re-evaluating in a loop.
    auto expr = ad::bind(ad::norm(y - ad::dot(X, theta)));

    // Differentiate and get loss
    auto loss = ad::autodiff(expr);

eddelbuettel commented 1 year ago

Love it! :heart: