Closed echo66 closed 6 years ago
Thanks for your interest! We'll be releasing a preprint that describes Feat in the coming days. The basic differences are:
Feat is definitely harder to install and we haven't made an official release yet, so you might want to start with Few and go from there. I'll keep this thread updated once we get some tangible empirical comparisons between the two.
Gradient descent??? I'm probably a "little bit" behind the state of the art regarding evolutionary approaches, but... doesn't gradient descent require your function to be differentiable? Evolutionary approaches don't require differentiability, right?
Feat will learn the constants for the subset of features that are differentiable using gradient descent. It is a local search built within the larger search for feature forms.
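To make the idea concrete, here is a minimal sketch (not Feat's actual code, and the feature form `a*sin(b*x)` is just a hypothetical example): an outer evolutionary search would propose feature forms, and for any form that happens to be differentiable, gradient descent locally tunes its constants.

```python
import numpy as np

def fit_constants(x, y, a=1.0, b=1.0, lr=0.05, steps=2000):
    """Locally optimize the constants (a, b) of the fixed feature form
    phi(x) = a * sin(b * x) by gradient descent on mean squared error.
    The *form* of the feature is not changed here; in Feat-like systems
    that part is handled by the outer evolutionary search."""
    for _ in range(steps):
        pred = a * np.sin(b * x)
        err = pred - y
        # Analytic gradients of the MSE with respect to a and b.
        grad_a = 2 * np.mean(err * np.sin(b * x))
        grad_b = 2 * np.mean(err * a * x * np.cos(b * x))
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

# Synthetic data with known constants; gradient descent recovers them
# given a nearby initialization supplied by the outer search.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = 1.5 * np.sin(0.8 * x)
a, b = fit_constants(x, y)
mse = np.mean((a * np.sin(b * x) - y) ** 2)
```

The point of the split is that the non-differentiable part of the problem (which functions to compose) stays with the evolutionary search, while the differentiable part (the constants inside a candidate feature) gets the faster local refinement.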
Hey @lacava !
Really interesting idea! Thanks for the explanation!
Here is the arXiv preprint I mentioned.
Greetings!
I would like to know whether there is any practical difference between the two projects. I'm asking because testing Feat would require a lot more effort than Few, so I need to know whether it is worth it.
Thanks in advance!