
Introduction to Deep Learning

The repository latex-math is used. Please read its README here: https://github.com/compstat-lmu/latex-math

General rules

Please observe the following rules when creating and editing the lecture and exercise slides:

Setup

  1. Clone this repository
  2. Clone the latex-math repository into the main directory of this repository
  3. Navigate to a folder where the slideset is contained, e.g. 2020/01-introduction
  4. If there is a Makefile in the folder, run make -f "Makefile"; otherwise render the slides with knitr::knit2pdf("slides.Rnw") (see the sketch below)
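
The steps above, condensed into a minimal R sketch. The folder name 2020/01-introduction and the two commands come from the list above; the clone URLs are the usual GitHub ones and may need adjusting to your setup.

```r
# Run once in a shell: clone this repository and put latex-math inside it.
#   git clone https://github.com/slds-lmu/lecture_i2dl.git
#   cd lecture_i2dl
#   git clone https://github.com/compstat-lmu/latex-math.git

# Then, from an R session, render one slide set (folder name is an example):
setwd("2020/01-introduction")

if (file.exists("Makefile")) {
  # Prefer the provided Makefile if the folder has one ...
  system('make -f "Makefile"')
} else {
  # ... otherwise knit the .Rnw source directly to PDF.
  knitr::knit2pdf("slides.Rnw")
}
```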

Structure

Topics may still be added or reworked; the standard topics are already covered by the slides:

  1. Introduction, Overview, and a Brief History of Deep Learning

  2. Deep Feed-Forward Neural Networks, Gradient Descent, Backpropagation, Hardware and Software

  3. Regularization of Neural Networks, Early Stopping

  4. Dropout and Challenges in Optimization

  5. Advances in Optimization

  6. Activation Functions and Initialization

  7. Convolutional Neural Networks, Variants of CNNs, Applications

  8. Modern CNNs and Overview of some Applications

  9. Recurrent Neural Networks

  10. Modern RNNs and Applications

  11. Deep Unsupervised Learning

  12. Autoencoders, AE Regularization and Variants

  13. Manifold Learning

  14. Deep Generative Models, VAE, GANs

Math and formulas

  1. Inline math within a line of text is written in the $ environment; displayed equations on separate lines are written in the $$ environment (see the example after this list)

  2. Always use the abbreviations defined in the header file within the code, to keep formulas short and consistent

  3. The repo latex-math is used. Please read the corresponding README: https://github.com/compstat-lmu/latex-math
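
A short LaTeX sketch of these conventions. The macros \xv and \thetab stand in for abbreviations of the kind defined in the latex-math header files; the exact macro names in the repo may differ.

```latex
% Inline math inside running text: the $ ... $ environment.
The model $f(\xv \mid \thetab)$ maps the feature vector $\xv$ to a prediction.

% A displayed equation on its own line: the $$ ... $$ environment.
$$
\mathcal{R}_{\mathrm{emp}}(\thetab)
  = \frac{1}{n} \sum_{i=1}^{n} L\big(y^{(i)}, f\big(\xv^{(i)} \mid \thetab\big)\big)
$$
```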

Deep Learning Material

Extra material to have a look at:

Good Websites to have a look at:

Optimization / Training of NNs:

Regularization:

CNNs:

Autoencoders:

Variational Autoencoders:

Reinforcement Learning:

LSTMs:

Hyperparameter Optimization / Neural Architecture Search / etc:

Software / Languages / etc.

Nice Demos and Visualizations

Material for Exercises

External material

  1. Other
    • PyData-2017-TF_TFS: Slides for Spark + TensorFlow + Notebooks