keras-team / keras

Deep Learning for humans
http://keras.io/

Is there an RBM in Keras? #461

Closed grgsolymosi closed 7 years ago

grgsolymosi commented 9 years ago

Recently I have been looking for an RBM in Keras, but with no success so far. Is there one somewhere in Keras?

fchollet commented 9 years ago

No, there is no Keras implementation of RBM that I know of.

Out of curiosity, what source / material got you interested in using RBMs? For what purpose?

wuaalb commented 9 years ago

I have an RBM and GB-RBM implementation in Keras; it was pretty straightforward to extend Keras, but they do feel a little out of place. I'll see if I can put them in a GitHub repository soon.

In my case I used them to implement a state-of-the-art method of converting speech of one speaker to sound like that of another; this probabilistic approach supposedly works better than directly optimizing the MSE loss of a DNN, possibly because MSE is not a good perceptual measure for speech.

lukedeo commented 9 years ago

I would definitely be interested -- I work in a field where having a generative model is extremely useful.

grgsolymosi commented 9 years ago

@fchollet I need them as a part of DBN.

@wuaalb Why would that be out of place? Based on my current knowledge, RBMs as the building blocks of a DBN are particularly powerful for certain problems, visual tasks excepted. Or did I miss something? :)

fchollet commented 9 years ago

Would be interested to check out the code as well...

grgsolymosi commented 9 years ago

@fchollet "Would be interested to check out the code as well..." Is that addressed to me or to @wuaalb?

wuaalb commented 9 years ago

Ok, I've uploaded the code to https://github.com/wuaalb/keras_extensions. I haven't tested it much, but hopefully it works.

I'll briefly describe how it is implemented.

There's an RBM Layer which is special in that it does not have an output and thus cannot be stacked. Instead, it is trained with a special SingleLayerUnsupervised Model, which can only contain a single Layer and uses a loss function of the form loss(X) rather than loss(y_pred, y_target). This loss function is actually part of the RBM Layer; in this case it implements the contrastive divergence loss using the RBM's model parameters. After training, the RBM Layer can be converted to Dense Layers: one going from visible to hidden and one going from hidden to visible.
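
For anyone who wants to see the moving parts without reading the whole repository, here is a minimal NumPy sketch of the idea: a binary (Bernoulli-Bernoulli) RBM trained with CD-1, whose learned weights are then wrapped as Keras Dense layers. The names `train_rbm` and `rbm_to_dense` are purely illustrative and are not the actual keras_extensions API:

```python
import numpy as np
from tensorflow import keras  # only needed for the final conversion step


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def train_rbm(X, n_hidden, lr=0.01, epochs=10, batch_size=32, seed=0):
    """Train a Bernoulli-Bernoulli RBM on X (n_samples x n_visible) with CD-1."""
    rng = np.random.default_rng(seed)
    n_visible = X.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    bv = np.zeros(n_visible)  # visible bias
    bh = np.zeros(n_hidden)   # hidden bias
    for _ in range(epochs):
        for start in range(0, len(X), batch_size):
            v0 = X[start:start + batch_size]
            # positive phase: P(h = 1 | v0), then sample h0
            ph0 = sigmoid(v0 @ W + bh)
            h0 = (rng.random(ph0.shape) < ph0).astype(float)
            # negative phase: one Gibbs step v0 -> h0 -> v1 -> h1
            pv1 = sigmoid(h0 @ W.T + bv)
            ph1 = sigmoid(pv1 @ W + bh)
            # CD-1 update: difference between data and reconstruction statistics
            W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
            bv += lr * (v0 - pv1).mean(axis=0)
            bh += lr * (ph0 - ph1).mean(axis=0)
    return W, bv, bh


def rbm_to_dense(W, bv, bh):
    """Wrap the learned parameters as two sigmoid Dense layers (vis->hid, hid->vis)."""
    n_visible, n_hidden = W.shape
    vis_to_hid = keras.layers.Dense(n_hidden, activation="sigmoid")
    vis_to_hid.build((None, n_visible))
    vis_to_hid.set_weights([W, bh])
    hid_to_vis = keras.layers.Dense(n_visible, activation="sigmoid")
    hid_to_vis.build((None, n_hidden))
    hid_to_vis.set_weights([W.T, bv])
    return vis_to_hid, hid_to_vis
```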

@Temmplar What I meant by "out of place" was that, as you may have noticed by now, Keras was designed mostly for supervised learning so trying to jam an RBM in there feels a little awkward.

There's also an attempt to add extra monitoring printouts during training (reconstruction loss, etc.) using special Callbacks, but it is pretty messy so far. Maybe there's a better way.
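
For what it's worth, a simple monitor built on the standard Keras Callback API might look something like the sketch below. Here `reconstruct_fn` is a hypothetical hook (e.g. one deterministic Gibbs step v -> h -> v' using the RBM's parameters), not something Keras provides:

```python
import numpy as np
from tensorflow import keras


class ReconstructionMonitor(keras.callbacks.Callback):
    """Print mean squared reconstruction error on held-out data after each epoch."""

    def __init__(self, x_val, reconstruct_fn):
        super().__init__()
        self.x_val = x_val
        self.reconstruct_fn = reconstruct_fn  # hypothetical: maps v -> reconstructed v

    def on_epoch_end(self, epoch, logs=None):
        recon = self.reconstruct_fn(self.x_val)
        mse = float(np.mean((self.x_val - recon) ** 2))
        print(f"epoch {epoch}: reconstruction MSE = {mse:.4f}")
```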

I guess it should be possible to train a DBN layer-by-layer, but I've never done this.
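
As a rough illustration of what greedy layer-wise pretraining could look like, reusing the hypothetical `train_rbm` / `rbm_to_dense` / `sigmoid` helpers sketched above (layer sizes chosen arbitrarily): each RBM is trained on the hidden-unit probabilities of the previous one, and the resulting visible-to-hidden Dense layers are stacked into one feed-forward model.

```python
layer_sizes = [512, 256, 64]      # hidden layer sizes, arbitrary for illustration
data = X                          # training data, shape (n_samples, n_visible)
dense_stack = []
for n_hidden in layer_sizes:
    W, bv, bh = train_rbm(data, n_hidden)    # train this level's RBM
    vis_to_hid, _ = rbm_to_dense(W, bv, bh)  # keep the visible -> hidden Dense layer
    dense_stack.append(vis_to_hid)
    data = sigmoid(data @ W + bh)            # hidden probabilities feed the next RBM

dbn = keras.Sequential(dense_stack)  # stacked DBN, ready for supervised fine-tuning
```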

simonzack commented 9 years ago

:+1:, I'm not that familiar with DBNs, but I wish to use them under Keras in the future. Adding them would make Keras a viable alternative to pylearn2 (I'm liking Keras more so far due to its modularity). If I ever get around to implementing DBNs myself, I'll see if I can make them fit into Keras.

stes commented 8 years ago

Is anyone working on this? I'm using RBMs/DBNs in my current research (also playing with the code from @wuaalb) and could try to integrate them into the Keras framework, if unsupervised learning is a desired feature.

metatl commented 8 years ago

@stes I am trying to use @wuaalb's code and definitely think it would be good to build a DBN or other unsupervised learning methods.

I have always found Andrew Ng's deep sparse autoencoders powerful, but haven't been able to build or find code for them.

bobobo1618 commented 8 years ago

+1 for RBM/DBN. There's a complete implementation for Theano in the form of morb if that helps.

Lan131 commented 8 years ago

I would also like to see this.

lerynth commented 8 years ago

@fchollet, my colleagues and I found that a DBN is quite a powerful feature extractor for our EEG-based sleep analysis. We certainly vote +1 for RBMs/DBNs and other unsupervised learning in Keras.

Elgins commented 8 years ago

I am struggling to implement DBNs with Theano and wish Keras would add this feature.

sxsxsx commented 8 years ago

Fchollet and other contributors, thank you so much for what you have done. Keras is quite helpful to me; I am very excited about it and interested in it, but I'm new to deep learning. Deep learning is significant for nonlinear system identification, such as state feature extraction and prediction. I would like to research how to use deep learning models in a water distribution system to predict the pressure or flow at measuring points from historical and current data. I am trying to use RNNs, LSTMs, and DBNs to identify and predict. Could you give me some suggestions? I am eager for your reply.

Elgins commented 8 years ago

@sxsxsx Hi sxsxsx, I can help you; let me add you on Facebook.

sxsxsx commented 8 years ago

@Elgins It is my pleasure to receive your reply. My Facebook: 284552720@qq.com (it is also my QQ mailbox). I look forward to your guidance.

QinglinDong commented 7 years ago

Is anyone still working on a Keras-wrapped RBM? I'm also in great need of this feature. Currently I have https://github.com/nitishsrivastava/deepnet/tree/master/deepnet and http://deeplearning.net/tutorial/rbm.html to play with, but I'd really appreciate it if we had it in Keras. Also, besides RBMs, there are lots of unsupervised networks that we could add to Keras.

tursunwali commented 7 years ago

@sxsxsx :" am tring to use RNN、LSTM 、DBN to identify and predict." @Elgins :"I can help you ,let me add your facebook." Elgins, I want to use those in my research too. Could you give me same help ! please My facebook : lordrisborik@gmail.com

stale[bot] commented 7 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.

mathemaphysics commented 4 years ago

"What got you interested in RBMs?" Easy. It's the closest neural network there is to autonomous animal brains. And contrastive divergence learning (via KL-divergence) is a natural approximation to unsupervised learning of a distribution.

https://github.com/mathemaphysics/neural is a cheap-and-cheerful C implementation of an RBM/DBN (depending on how you run it) using KL-divergence via Monte Carlo-style Gibbs sampling. I added a simple X11 front end (and also an ncurses one) for quick visualization.