drawbridge / keras-mmoe

A TensorFlow Keras implementation of "Modeling Task Relationships in Multi-task Learning with Multi-gate Mixture-of-Experts" (KDD 2018)
MIT License

Keras-MMoE

This repo contains an implementation of the Multi-gate Mixture-of-Experts (MMoE) model in TensorFlow Keras.
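As background, MMoE shares a pool of expert networks across all tasks and gives each task its own softmax gate that mixes the expert outputs before that task's tower network. A minimal NumPy sketch of the forward pass (illustrative only; the layer names, sizes, and linear experts here are simplifying assumptions, not this repo's API):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
d_in, d_expert, n_experts, n_tasks = 8, 4, 3, 2
x = rng.normal(size=(5, d_in))  # a batch of 5 input vectors

# Shared experts (each reduced to a single linear map for brevity)
W_experts = rng.normal(size=(n_experts, d_in, d_expert))
expert_out = np.einsum("bi,eij->bej", x, W_experts)  # (batch, experts, d_expert)

# One softmax gate per task, producing mixture weights over the experts
W_gates = rng.normal(size=(n_tasks, d_in, n_experts))
for t in range(n_tasks):
    g = softmax(x @ W_gates[t])                          # (batch, n_experts)
    task_input = np.einsum("be,bej->bj", g, expert_out)  # gated expert mixture
    # task_input would feed task t's tower network
    assert task_input.shape == (5, d_expert)
```

Because each task learns its own gate, loosely related tasks can weight the experts differently instead of sharing one fixed bottom network.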

Here's the video explanation of the paper by the authors.

The repository includes:

  - The MMoE model implementation in TensorFlow Keras
  - An example (`census_income_demo.py`) that trains the model on the census income dataset

The code is documented and designed to be extended relatively easily. If you plan on using this in your work, please consider citing this repository (BibTeX is included below) and also the paper.

Getting Started

Requirements

Installation

  1. Clone the repository
  2. Install dependencies
    pip install -r requirements.txt
  3. Run the example code
    python census_income_demo.py
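Combined, the steps above look like the following (assuming `git` and a Python environment with `pip`; the repository URL is taken from the citation section):

```shell
# 1. Clone the repository and enter it
git clone https://github.com/drawbridge/keras-mmoe.git
cd keras-mmoe

# 2. Install dependencies
pip install -r requirements.txt

# 3. Run the census income example
python census_income_demo.py
```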

Notes

Contributing

Contributions to this repository are welcome. Examples of things you can contribute:

Citation

Use this BibTeX to cite the repository:

@misc{keras_mmoe_2018,
  title={Multi-gate Mixture-of-Experts model in Keras and TensorFlow},
  author={Deng, Alvin},
  year={2018},
  publisher={GitHub},
  journal={GitHub repository},
  howpublished={\url{https://github.com/drawbridge/keras-mmoe}},
}

Acknowledgments

The code is built upon the work by Emin Orhan.