In this work, we introduce Boolformer, the first Transformer architecture trained to perform end-to-end symbolic regression of Boolean functions. First, we show that it can predict compact formulas for complex functions which were not seen during training, when provided a clean truth table. Then, we demonstrate its ability to find approximate expressions when provided incomplete and noisy observations. We evaluate the Boolformer on a broad set of real-world binary classification datasets, demonstrating its potential as an interpretable alternative to classic machine learning methods. Finally, we apply it to the widespread task of modelling the dynamics of gene regulatory networks. Using a recent benchmark, we show that Boolformer is competitive with state-of-the-art genetic algorithms with a speedup of several orders of magnitude. Our code and models are available publicly.