
Tenbilac

Tenbilac is a simple, exploratory feedforward neural network library designed to yield statistically accurate regressions despite noisy input features. Thanks to a special structure of the training data, networks can be trained to minimize bias instead of error. This is useful for solving inverse regression problems (also known as "calibration problems" of regression).
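
To give a rough illustration of the cost functions involved (a minimal numpy sketch, not tenbilac's actual code): assume the training set contains several noise realizations of each case. The mean square bias (MSB) penalizes the bias of the predictions averaged over these realizations, while the conventional mean square error (MSE) penalizes the scatter of every individual prediction.

    import numpy as np

    def msb(predictions, targets):
        # predictions: shape (n_realizations, n_cases), network outputs for
        #              several noise realizations of each training case
        # targets: shape (n_cases,), the true explanatory variables
        # The bias is measured on the prediction averaged over realizations.
        return np.mean((np.mean(predictions, axis=0) - targets) ** 2)

    def mse(predictions, targets):
        # Conventional cost: every single realization should match the target.
        return np.mean((predictions - targets) ** 2)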

Note that the present implementation is a demonstration more than an optimized library: it is based on numpy and relies purely on numerical differentiation via scipy.optimize.
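
In practice, such an approach amounts to flattening all network weights into a single parameter vector and handing a scalar cost function to a generic optimizer, which estimates gradients by finite differences. The following is a hedged sketch of this idea, not tenbilac's actual interface; all function names are illustrative only.

    import numpy as np
    from scipy import optimize

    def train(cost, initial_params):
        # 'cost' maps a flat parameter vector to a scalar cost (e.g. the MSB
        # on the training set). No analytic gradient is supplied, so scipy
        # falls back to finite-difference (numerical) differentiation.
        result = optimize.minimize(cost, initial_params, method="BFGS")
        return result.x

    # Toy usage: fit a single linear "neuron" y = w*x + b to noiseless data.
    x = np.linspace(-1.0, 1.0, 50)
    y = 2.0 * x + 0.5

    def toy_cost(params):
        w, b = params
        return np.mean((w * x + b - y) ** 2)

    best = train(toy_cost, np.zeros(2))
    print(best)  # approximately [2.0, 0.5]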

For a description of the algorithm and references, see Section 3 and Appendix A of the related paper: arXiv:1807.02120.

Some technical features of tenbilac are:

Installation

You could install it with python setup.py install, but given that this code is quite experimental, we recommend simply adding the location of your clone of this repository to your PYTHONPATH.

To do so, if you use bash, add the following line to your .bash_profile, .profile, or equivalent file:

export PYTHONPATH=${PYTHONPATH}:/path/to/tenbilac/
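
You can then check that the package is found (this assumes that your clone exposes the tenbilac package at its top level):

export PYTHONPATH=${PYTHONPATH}:/path/to/tenbilac/
python -c "import tenbilac; print(tenbilac.__file__)"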

Directory structure

Tutorial

The documented code in demo/paper_figure serves as an example to demonstrate the basic features of tenbilac, following Appendix A of the paper. It first generates some training data in the form of noisy observations d that depend on an explanatory variable theta. It then performs inverse regressions of the explanatory variable given noisy observations. By training against the mean square bias (MSB) cost function, the results are much more accurate than those obtained with the conventional mean square error (MSE) or by training on noiseless data.
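
The structure of such training data can be sketched as follows. This is a hypothetical toy setup rather than the actual demo code; the function f and the noise level sigma are illustrative assumptions.

    import numpy as np

    n_cases = 100        # number of values of the explanatory variable theta
    n_realizations = 50  # noisy observations drawn for each theta

    theta = np.random.uniform(0.0, 1.0, size=n_cases)

    # Each observation d is a noisy function of theta.
    def f(theta):
        return theta ** 2  # illustrative choice only

    sigma = 0.1  # illustrative noise level
    d = f(theta) + sigma * np.random.randn(n_realizations, n_cases)

    # d has shape (n_realizations, n_cases): the extra realization axis is
    # what allows a bias (rather than an error) to be estimated and
    # minimized during training.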

Demo figure

To learn about the more advanced interface (reading config files, restarting from previous training runs, etc.), see the demonstration in demo/com_interface, or explore how MomentsML uses tenbilac.