My personal notes on Machine Learning and other random topics
Last edited: 2024-04-23
This repo contains my collection of materials and random notes taken while researching and experimenting with Machine Learning (ML) and Artificial Neural Networks (NNs). It covers topics such as numerical methods, differential calculus, artificial neural networks, libraries, implementations, materials I ended up using during my research, and other random topics I found interesting. It is a work in progress and subject to constant change.
Contents
Some materials are simple files, while others are organized in subdirectories (unordered list):
Horovod
Horovod was created internally at Uber to make it easy to take a single-GPU training script and scale it to train on many GPUs in parallel.
The horovod directory contains some Notebooks with examples:
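Conceptually, Horovod's data-parallel training boils down to an allreduce step: each worker computes gradients on its own shard of the data, and the gradients are averaged across workers before every weight update. The following is a minimal plain-Python sketch of that averaging step (it simulates the workers in one process and does not use the actual Horovod API):

```python
def allreduce_mean(worker_grads):
    """Element-wise average of per-worker gradient vectors (conceptual allreduce)."""
    n = len(worker_grads)
    return [sum(components) / n for components in zip(*worker_grads)]

# Three simulated workers, each with gradients computed on its own data shard.
grads = [
    [0.2, -0.4, 0.1],
    [0.4, -0.2, 0.3],
    [0.0, -0.6, 0.2],
]
avg = allreduce_mean(grads)  # every worker would then apply this same update
print(avg)
```

In real Horovod the averaging is done efficiently with a ring-allreduce across processes, so no single node becomes a communication bottleneck.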
Some info is available in my other repos:
- In my MSc repo, dedicated to my master's thesis, I trained a convolutional NN.
- In my CAP-351 course notes I made these notebooks:
- project1-mlp.ipynb - a Multilayer Perceptron (MLP) is a fully connected feed-forward artificial neural network.
- project2-som.ipynb - a self-organizing map (SOM), or self-organizing feature map, is an unsupervised machine learning technique used to produce a low-dimensional representation of a higher-dimensional data set while preserving its topological structure.
- project3-vae.ipynb - a variational autoencoder (VAE) is an artificial neural network architecture introduced by Diederik P. Kingma and Max Welling, belonging to the families of probabilistic graphical models and variational Bayesian methods.
- project4-cnn.ipynb - a Convolutional Neural Network (CNN, or ConvNet) is a class of artificial neural network (NN), most commonly applied to analyze visual imagery.
- project5-rnn.ipynb - a Recurrent Neural Network (RNN) is a class of artificial neural networks where connections between nodes can create a cycle, allowing output from some nodes to affect subsequent input to the same nodes.
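To make the MLP description above concrete, here is a minimal forward pass for a tiny fully connected network (2 inputs, 2 tanh hidden units, 1 linear output) in plain Python. The weights are arbitrary toy values, not taken from any of the notebooks:

```python
import math

def mlp_forward(x, w1, b1, w2, b2):
    """Forward pass of a tiny MLP: input -> hidden layer (tanh) -> linear output."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Toy parameters: 2 inputs -> 2 hidden units -> 1 output.
w1 = [[0.5, -0.5], [0.3, 0.8]]
b1 = [0.0, 0.1]
w2 = [[1.0, -1.0]]
b2 = [0.2]

y = mlp_forward([1.0, 2.0], w1, b1, w2, b2)
print(y)
```

Every layer is "fully connected" in the sense that each unit combines all outputs of the previous layer; training would adjust `w1`, `b1`, `w2`, `b2` by backpropagation, which this sketch omits.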