From-scratch Python implementations (using NumPy and Matplotlib) of machine learning algorithms for classification.
The training set with N elements is defined as D = {(X1, y1), ..., (XN, yN)}, where each Xi is a feature vector and each label yi in {0, 1} is one-hot encoded.
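As a minimal sketch of the label encoding described above (the helper name `one_hot` is mine, not from the repository), labels can be one-hot encoded with NumPy like this:

```python
import numpy as np

def one_hot(labels, num_classes):
    """Convert integer class labels to one-hot encoded row vectors."""
    encoded = np.zeros((len(labels), num_classes))
    encoded[np.arange(len(labels)), labels] = 1
    return encoded

# Three samples, two classes: D = {(X1, y1), (X2, y2), (X3, y3)}
y = one_hot([0, 1, 1], num_classes=2)
# Each row is a one-hot vector, e.g. y[0] is [1.0, 0.0]
```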
Sample code at the end of each file.
The Neural_Network_Derivatives.pdf document contains the derivations of the derivatives used in the code, except for logistic regression, which follows Yaoliang Yu's lecture notes - see reference.
autoencoder: Autoencoder with a sigmoid activation function in the 2nd and 4th layers.
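A minimal sketch of such a forward pass, assuming illustrative layer sizes of 8-4-2-4-8 (the actual sizes and weight initialization in the repository may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical layer sizes: 8 -> 4 (sigmoid) -> 2 -> 4 (sigmoid) -> 8
sizes = [8, 4, 2, 4, 8]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    """Forward pass; sigmoid when producing the 2nd and 4th layers, linear elsewhere."""
    a = x
    for i, W in enumerate(weights):
        z = a @ W
        a = sigmoid(z) if i in (0, 2) else z  # i=0 -> layer 2, i=2 -> layer 4
    return a

x = rng.standard_normal((1, 8))
x_hat = forward(x)  # reconstruction with the same shape as the input
```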
cnn: Convolutional Neural Network with a flexible architecture and sigmoid and ReLU activation functions; the number of layers is configurable.
Note: the code was translated from Matlab to Python. The original code is at https://github.com/ClodoaldoLima/Convolutional-Neural-Networks---Matlab
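For illustration, the core convolution operation of a CNN can be sketched as follows (a naive valid cross-correlation followed by ReLU; this is my own toy sketch, not the repository's implementation):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (strictly, cross-correlation, as in most CNN code)."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(0, x)

image = np.arange(16.0).reshape(4, 4)  # toy "image" with constant gradient
edge = np.array([[-1.0, 1.0]])         # horizontal-difference filter
feat = relu(conv2d(image, edge))       # (4, 3) feature map of ones
```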
ensemble: Implementation of three ensemble methods for neural networks.
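The three methods are not listed here; as one common example of combining the outputs of several classifiers (not necessarily one of the repository's three), majority voting can be sketched as:

```python
import numpy as np

def majority_vote(predictions):
    """Combine integer class predictions from several models by majority vote."""
    predictions = np.asarray(predictions)  # shape: (n_models, n_samples)
    n_classes = predictions.max() + 1
    # Per-sample vote counts, shape (n_classes, n_samples)
    counts = np.apply_along_axis(np.bincount, 0, predictions,
                                 minlength=n_classes)
    return counts.argmax(axis=0)

# Three models voting on four samples
preds = [[0, 1, 1, 0],
         [0, 1, 0, 0],
         [1, 1, 1, 0]]
combined = majority_vote(preds)  # -> [0, 1, 1, 0]
```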
mixture of experts: Two mixture-of-experts setups for time series.
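A mixture of experts blends expert outputs through a gating network; a toy sketch for one-step time-series prediction (all names, sizes, and the linear experts are illustrative, not taken from the repository):

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Two hypothetical linear experts and a softmax gate over 3 lagged inputs
W_gate = rng.standard_normal((3, 2)) * 0.1     # gate: 3 lags -> 2 experts
W_experts = rng.standard_normal((3, 2)) * 0.1  # each column is one expert

def moe_predict(X):
    """Gate weights blend the experts' outputs per sample."""
    g = softmax(X @ W_gate)       # (n, 2) mixing coefficients, rows sum to 1
    e = X @ W_experts             # (n, 2) expert outputs
    return (g * e).sum(axis=1)    # weighted combination

X = rng.standard_normal((5, 3))   # 5 windows of 3 past values
y_hat = moe_predict(X)            # one prediction per window
```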
neuralnets: Single Layer Perceptron (SLP) and Multi Layer Perceptron (MLP).
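A minimal MLP forward pass with one sigmoid hidden layer and a softmax output suited to one-hot targets (sizes and names are illustrative, not taken from the repository):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical MLP: 2 inputs -> 3 hidden units -> 2 output classes
W1, b1 = rng.standard_normal((2, 3)), np.zeros(3)
W2, b2 = rng.standard_normal((3, 2)), np.zeros(2)

def mlp_forward(X):
    """One sigmoid hidden layer, softmax output; rows of the result sum to 1."""
    h = sigmoid(X @ W1 + b1)
    scores = h @ W2 + b2
    exp = np.exp(scores - scores.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

X = rng.standard_normal((4, 2))
probs = mlp_forward(X)  # (4, 2) class probabilities
```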
optimization: First- and second-order optimization methods used for backpropagation in machine learning.
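To illustrate the difference between the two families on a simple quadratic (this example is mine, not from the repository): gradient descent takes many small steps, while Newton's method, using the Hessian, solves a quadratic in a single step.

```python
import numpy as np

# Minimize f(w) = 0.5 * w^T A w - b^T w; gradient: A w - b, Hessian: A
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(w):
    return A @ w - b

# First-order: gradient descent with a fixed step size
w = np.zeros(2)
for _ in range(200):
    w = w - 0.1 * grad(w)

# Second-order: Newton's method reaches the optimum in one step here
w0 = np.zeros(2)
w_newton = w0 - np.linalg.solve(A, grad(w0))

w_star = np.linalg.solve(A, b)  # analytic optimum for comparison
```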
regression: Implementation of three regression setups for classification.
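As an illustration of one such setup, logistic regression trained by gradient descent on a toy one-dimensional dataset (my own sketch, not the repository's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy linearly separable data (illustrative only)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
Xb = np.hstack([X, np.ones((4, 1))])  # append a bias column

w = np.zeros(2)
for _ in range(2000):
    p = sigmoid(Xb @ w)
    w -= 0.5 * Xb.T @ (p - y) / len(y)  # gradient of the mean cross-entropy

preds = (sigmoid(Xb @ w) >= 0.5).astype(int)  # training-set predictions
```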
svm: Implementation of three Support Vector Machine models for binary and multi-class classification, supporting several kernel types.
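Common SVM kernel functions (the repository's exact list is not shown here; linear, polynomial, and RBF are standard choices) can be written as:

```python
import numpy as np

def linear_kernel(x, z):
    return x @ z

def polynomial_kernel(x, z, degree=3, c=1.0):
    return (x @ z + c) ** degree

def rbf_kernel(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z) ** 2))

x = np.array([1.0, 0.0])
z = np.array([0.0, 1.0])
k_lin = linear_kernel(x, z)       # 0.0 for orthogonal vectors
k_poly = polynomial_kernel(x, z)  # (0 + 1)^3 = 1.0
k_rbf = rbf_kernel(x, x)          # 1.0 for identical inputs
```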
I would like to acknowledge Professor Clodoaldo A. Moraes Lima for his guidance and support during the machine learning course at Universidade de Sao Paulo.