
A feature engineering automation tool for learning data representations
https://cavalab.org/feat
GNU General Public License v3.0

FEAT


FEAT is a feature engineering automation tool that learns new representations of raw data to improve classifier and regressor performance. The underlying methods use Pareto optimization and evolutionary computation to search the space of possible transformations.
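
To loosely illustrate the Pareto optimization idea (this is a simplified sketch, not FEAT's internal code), candidate representations can be compared on multiple objectives at once, e.g. prediction error and representation complexity, and only the non-dominated candidates survive:

```python
# Illustrative sketch of Pareto dominance for two minimization
# objectives (validation error, representation complexity).
# Simplified for illustration; not FEAT's actual implementation.

def dominates(a, b):
    """True if candidate a is at least as good as b on every objective
    and strictly better on at least one (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(candidates):
    """Return the non-dominated subset of a list of objective tuples."""
    return [c for c in candidates
            if not any(dominates(other, c)
                       for other in candidates if other != c)]

# Each tuple is (error, complexity) for one candidate representation.
scores = [(0.10, 12), (0.15, 5), (0.09, 20), (0.15, 9), (0.30, 3)]
print(pareto_front(scores))  # (0.15, 9) is dominated by (0.15, 5)
```

The surviving trade-off front is what lets FEAT offer a set of representations rather than a single winner.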

FEAT wraps around a user-chosen ML method and provides a set of representations that give the best performance for that method. Each individual in FEAT's population is its own data representation.
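
To make the "individual = representation" idea concrete, here is a hypothetical sketch (the names and structure are ours, not FEAT's API): an individual is a collection of feature-constructing functions, and evaluating it maps the raw data matrix to a new feature matrix that the wrapped ML method would be trained on.

```python
import numpy as np

# Hypothetical sketch: an "individual" is a list of feature-constructing
# functions; evaluating it yields the transformed dataset for the wrapped
# ML method. This mirrors the concept only, not FEAT's API.

def evaluate(individual, X):
    """Stack each transform's output column-wise into a new representation."""
    return np.column_stack([f(X) for f in individual])

# Raw data: 4 samples, 2 features.
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0],
              [7.0, 8.0]])

# One candidate representation: three engineered features.
individual = [
    lambda X: X[:, 0] * X[:, 1],   # interaction term
    lambda X: np.log(X[:, 0]),     # nonlinear transform
    lambda X: X[:, 1] - X[:, 0],   # difference feature
]

Z = evaluate(individual, X)
print(Z.shape)  # (4, 3): same samples, new engineered features
```

In FEAT these transformations are expression trees evolved by genetic programming rather than hand-written lambdas.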

FEAT uses the Shogun C++ ML toolbox to fit models.

Check out the documentation for installation and examples.

References

  1. La Cava, W., Singh, T. R., Taggart, J., Suri, S., & Moore, J. H. (2019). Learning concise representations for regression by evolving networks of trees. ICLR 2019. arXiv:1807.00981

  2. La Cava, W., & Moore, J. H. (2020). Genetic programming approaches to learning fair classifiers. GECCO 2020. Best Paper Award. ACM, arXiv, experiments

  3. La Cava, W. G., Lee, P. C., Ajmal, I., Ding, X., Solanki, P., Cohen, J. B., Moore, J. H., & Herman, D. S. (2023). A flexible symbolic regression method for constructing interpretable clinical prediction models. npj Digital Medicine, 6(1), 1–14. HUMIES Silver Award winner. nature, medRxiv, experiments

Contact

Maintained by William La Cava (william.lacava at childrens.harvard.edu)

Acknowledgments

This work is supported by grants K99-LM012926 and R00-LM012926 from the National Library of Medicine. FEAT is being developed to learn clinical diagnostics in the Cava Lab at Harvard Medical School.

License

GNU GPLv3, see LICENSE

Installation

To see our installation process from scratch, check out the GitHub Actions workflow.

Dependencies

FEAT uses CMake to build. It depends on the Eigen matrix library for C++ and the Shogun ML library. Both are available as conda packages that should work across platforms.

Install in a Conda Environment

The easiest way to install is to use the conda environment we provide. The build process is then:

git clone https://github.com/lacava/feat # clone the repo
cd feat # enter the directory
conda env create
conda activate feat
pip install .

If you want to manage the dependencies yourself, some other options are shown below. In that case, you need to tell the configure script where Shogun and Eigen are by editing these lines:

export SHOGUN_LIB=/your/shogun/lib/
export SHOGUN_DIR=/your/shogun/include/
export EIGEN3_INCLUDE_DIR=/your/eigen/eigen3/

If you need Eigen and Shogun and don't want to use conda, follow these instructions.

Eigen

Eigen is a header-only library. FEAT requires Eigen 3 or greater.

Debian/Ubuntu

On Debian systems, you can grab the package:

sudo apt-get install libeigen3-dev

You can also download the headers and put them somewhere, then tell CMake where they are with the environment variable EIGEN3_INCLUDE_DIR. Example:

# Eigen 3.3.4
wget "http://bitbucket.org/eigen/eigen/get/3.3.4.tar.gz"
tar xzf 3.3.4.tar.gz 
mkdir eigen-3.3.4 
mv eigen-eigen*/* eigen-3.3.4
# set an environment variable to tell cmake where Eigen is
export EIGEN3_INCLUDE_DIR="$(pwd)/eigen-3.3.4/"

Shogun

You don't have to compile Shogun; just download the binaries. Their install guide is good. We've listed two of the options here.

Debian/Ubuntu

You can also get the Shogun packages:

sudo add-apt-repository ppa:shogun-toolbox/nightly -y
sudo apt-get update -y
sudo apt-get install -qq --force-yes --no-install-recommends libshogun18
sudo apt-get install -qq --force-yes --no-install-recommends libshogun-dev

Running the tests

(optional) If you want to run the C++ tests, you need to install Google Test. A useful guide to doing so is available here. Then you can use cmake to build the tests. From the repo root:

./configure tests   # builds the test Makefile
make -C build tests # compiles the tests
./build/tests # runs the tests

For the python tests, run

python tests/wrappertest.py

Contributing

Please follow the GitHub flow guidelines for contributing to this project.

In general, this is the approach: