apache / mxnet

Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
https://mxnet.apache.org
Apache License 2.0

Implement the ICCV 2015 best paper "Deep Neural Decision Forests" #1083

Closed · futurely closed this issue 8 years ago

futurely commented 8 years ago

The creators of Deep Neural Decision Forests [1] implemented their networks in DMLC/CXXNET, which has been superseded by this project. It is much more efficient to estimate the leaf node probability distributions from a large mini-batch than from the whole training set (a sketch of this update follows the reference below). One of the major advantages CXXNET brought to their proposed dNDF.NET was distributed training, in which the effective mini-batch size is the single-node mini-batch size multiplied by the number of cluster nodes, while training remains much faster than on a single node.

So far, the only attempt at implementing the paper on GitHub is based on Theano and is incomplete.

[1] P. Kontschieder, M. Fiterau, A. Criminisi, and S. Rota Bulò. Deep Neural Decision Forests. ICCV 2015.
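
For concreteness, here is a minimal NumPy sketch of the paper's iterative leaf-distribution update, estimated on a mini-batch as suggested above. The function name, argument shapes, and the `eps` smoothing are my own assumptions, not code from the paper or from CXXNET:

```python
import numpy as np

def update_leaf_distributions(pi, mu, labels, eps=1e-8):
    """One iteration of the dNDF leaf update, estimated on a mini-batch.

    pi:     (n_leaves, n_classes) current per-leaf class distributions
    mu:     (batch, n_leaves)     routing probability of each sample to each leaf
    labels: (batch,)              integer class labels
    """
    n_classes = pi.shape[1]
    # P_T(t_i | x_i) = sum_l mu[i, l] * pi[l, t_i] for each sample i
    p = (mu * pi[:, labels].T).sum(axis=1) + eps       # (batch,)
    onehot = np.eye(n_classes)[labels]                 # (batch, n_classes)
    # accumulate sum_{i: t_i = y} mu[i, l] / P_T(t_i | x_i) per (leaf, class)
    weights = (mu / p[:, None]).T @ onehot             # (n_leaves, n_classes)
    new_pi = pi * weights + eps
    return new_pi / new_pi.sum(axis=1, keepdims=True)  # renormalize each leaf
```

In the distributed setting described above, each worker could presumably compute its `weights` term on its local mini-batch and sum the results across workers before renormalizing, which is why the effective mini-batch size becomes the per-node batch size times the number of cluster nodes.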

SkidanovAlex commented 8 years ago

I built neural decision forests in Lasagne (https://github.com/SkidanovAlex/ShallowNeuralDecisionForest), where I managed to evaluate the leaf probabilities in vector form, allowing the entire model to be executed on a GPU. I was, however, never able to match their results: it was on par with a regular NN on all the tasks I tried it on, but took longer to converge. They never responded to my email requesting more details, so I don't know what exactly I am doing differently.
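
As an illustration of evaluating the leaf probabilities in vector form: a soft decision tree can compute all routing probabilities with dense tensor ops, so the whole batch runs on a GPU without per-sample tree traversal. This is my own reconstruction in NumPy (the linked repository uses Lasagne/Theano), assuming a full binary tree with decision nodes stored breadth-first:

```python
import numpy as np

def leaf_routing_probabilities(decisions):
    """Vectorized soft-routing probabilities mu_l(x) for a full binary tree.

    decisions: (batch, 2**depth - 1) sigmoid outputs of the decision nodes,
               stored in breadth-first order.
    Returns:   (batch, 2**depth)     probability of each sample reaching each leaf.
    """
    batch, n_nodes = decisions.shape
    depth = int(np.log2(n_nodes + 1))
    mu = np.ones((batch, 1))
    begin = 0
    for level in range(depth):
        d = decisions[:, begin:begin + 2**level]       # nodes at this level
        # each node splits its mass between its left (d) and right (1 - d) child
        mu = np.stack([mu * d, mu * (1.0 - d)], axis=2).reshape(batch, -1)
        begin += 2**level
    return mu
```

Because every step is an elementwise product followed by a reshape over the whole batch, the routing stays entirely in tensor ops, which is what allows the model to execute on a GPU.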

piiswrong commented 8 years ago

I think they had the order wrong: decision forests are good at handling discrete data. Putting a DNN under them takes that advantage away, leaving just another DNN ensemble. You should put DNNs on top of decision forests.

winstywang commented 8 years ago

Closing for now since nobody seems interested in implementing this feature. Feel free to reopen if there are updates.