jkaardal / matlab-convolutional-autoencoder

Cost function and cost gradient function for a convolutional autoencoder.
MIT License

Is there a demo to use this code? #2

Closed chaiein closed 5 years ago

chaiein commented 5 years ago

It would be helpful if a sample demo and instructions on how to run the code were provided.

jkaardal commented 5 years ago

Hi @chaiein, there is an example of how to use the code here. I would recommend using this code only for educational purposes (e.g. if you are interested in constructing the gradient/backpropagation algorithm from scratch). If you are looking for software to use in a professional capacity, you may be better served by the usual deep learning frameworks like tensorflow/torch if any of the following are likely to cause issues for your particular application:

  1. The gradient calculation is quite suboptimal in terms of compute time. I have some similar code (though suited to a slightly different purpose) where I went to great lengths to vectorize the gradient calculation, and it ended up being about 10x faster than this code, so there are plenty of gains to be made by vectorizing.
  2. The graph structure of this neural network is inflexible. Only the first layer is a convolutional layer, while the (optional) additional hidden layers are all fully connected.
  3. With the exception of the output layer, which is linear, all of the hidden layers use logistic activations. This was originally written at a time when logistic units were still common in neural networks, but the field has since moved away from sigmoidal units (except where "squashing" is necessary) in favor of rectified linear units, owing to the vanishing-gradient problem that sigmoids suffer from.
  4. The cost function is fixed to least squares.
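To make point 1 concrete, here is a small NumPy sketch (not the MATLAB code from this repository) of the kind of vectorization gain mentioned above: a per-sample loop of outer products for a fully-connected weight gradient versus the mathematically identical single matrix product. The array names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_in, n_out = 256, 64, 32
X = rng.standard_normal((n_samples, n_in))       # layer inputs, one row per sample
delta = rng.standard_normal((n_samples, n_out))  # backpropagated errors per sample

# Looped accumulation: one outer product per sample.
grad_loop = np.zeros((n_in, n_out))
for i in range(n_samples):
    grad_loop += np.outer(X[i], delta[i])

# Vectorized: the same sum of outer products as a single matrix product.
grad_vec = X.T @ delta

assert np.allclose(grad_loop, grad_vec)
```

The two results agree to floating-point precision; the vectorized form simply hands the whole sum to the BLAS-backed matrix multiply instead of looping in the interpreter, which is where the speedup comes from (the same idea applies to MATLAB's interpreted loops).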
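The vanishing-gradient issue in point 3 can be illustrated numerically: the logistic derivative is at most 0.25, and backpropagation multiplies one such factor per layer, while ReLU's derivative is exactly 1 on positive inputs. A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-6, 6, 1001)
dsig = sigmoid(z) * (1.0 - sigmoid(z))  # derivative of the logistic function

# The logistic derivative peaks at 0.25 (at z = 0) and decays toward 0 in the tails.
print(dsig.max())

# Backprop multiplies one such factor per layer, so a deep stack of logistic
# layers can shrink the gradient scale like 0.25**depth, e.g.:
print(0.25 ** 10)  # ~9.5e-7

# ReLU's derivative is 1 wherever the unit is active, avoiding that shrinkage.
drelu = (z > 0).astype(float)
print(drelu.max())
```

This is why rectified linear units became the default for deep networks, as noted above, while sigmoids remain useful where outputs must be squashed into (0, 1).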
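For point 4, the fixed least-squares cost and its output-layer gradient are simple to state. A hedged NumPy sketch (function name and shapes are my own, not from this repository):

```python
import numpy as np

def least_squares_cost(X, X_hat):
    """Least-squares reconstruction cost and its gradient w.r.t. X_hat.

    X     : original data, one row per sample
    X_hat : the autoencoder's reconstruction of X
    """
    resid = X_hat - X
    cost = 0.5 * np.sum(resid ** 2)
    grad = resid  # d(cost)/d(X_hat), the term backpropagated from the output
    return cost, grad

X = np.array([[1.0, 2.0]])
X_hat = np.array([[1.5, 1.0]])
cost, grad = least_squares_cost(X, X_hat)
print(cost)  # 0.625
```

Swapping in a different objective (e.g. cross-entropy for binary data) would mean replacing both the cost and this gradient term, which is exactly the flexibility the code lacks.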

If none of these issues are a problem for your application, then by all means use this code. The code is correct; it is just inflexible and a bit slow.