RoboticsClubIITJ / ML-DL-implementation

An implementation of ML and DL algorithms from scratch in Python using nothing but NumPy and Matplotlib.
BSD 3-Clause "New" or "Revised" License

ML-DL-implementation


Machine Learning and Deep Learning library in Python using NumPy and Matplotlib.

Why this repository?


This repository gives beginners and newcomers in the field of AI and ML a chance to understand the inner workings of popular learning algorithms. It presents ML and DL algorithms implemented in pure Python, using only NumPy as the backend for linear-algebraic computations, in a form that is simple to read and analyze.

The goal of this repository is not to create the most efficient implementation but the most transparent one, so that anyone with little knowledge of the field can contribute and learn.
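As a taste of that transparency, here is a minimal sketch (not the repository's own code) of linear regression trained by gradient descent in pure NumPy, the style of implementation the library aims for:

```python
import numpy as np

# Illustrative sketch only: the repository's models.py may use
# different class names, signatures, and hyperparameters.
def fit_linear_regression(X, y, lr=0.1, epochs=500):
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        y_pred = X @ w + b
        error = y_pred - y
        # Gradients of mean squared error with respect to w and b
        w -= lr * (2 / n) * (X.T @ error)
        b -= lr * (2 / n) * error.sum()
    return w, b

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])  # generated from y = 2x + 1
w, b = fit_linear_regression(X, y)
```

On this toy data the fit recovers approximately w = 2 and b = 1, the slope and intercept used to generate y.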

Installation

You can install the library by running the following command:

python3 setup.py install

For development purposes, you can use the develop option as shown below:

python3 setup.py develop

Testing

To test your patch locally, follow the steps given below:

  1. Install pytest-cov. Skip this step if you already have the package.
  2. Run python3 -m pytest --doctest-modules --cov=./ --cov-report=html. Then open htmlcov/index.html in your browser to view the coverage report. Try to ensure that the coverage does not decrease by more than 1% for your patch.

Contributing to the repository

Follow the steps below to get started with contributing to the repository.

Algorithms Implemented

| Activation Functions (`activations.py`) | Optimizers (`optimizers.py`) | Models (`models.py`) | Backend | Pre-processing Methods (`preprocessor_utils.py`) |
|---|---|---|---|---|
| Sigmoid | Gradient Descent | Linear Regression | Autograd (`autograd.py`) | Bell Curve |
| Tanh | Stochastic Gradient Descent | Logistic Regression | Tensor (`tensor.py`) | Standard_Scaler |
| Softmax | Mini Batch Gradient Descent | Decision Tree Classifier | Functions (`functional.py`) | MaxAbs_Scaler |
| Softsign | Momentum Gradient Descent | KNN Classifier/Regressor | | Z_Score_Normalization |
| Relu | Nesterov Accelerated Descent | Naive Bayes | | Mean_Normalization |
| Leaky Relu | Adagrad | Gaussian Naive Bayes | | Min Max Normalization |
| Elu | Adadelta | Multinomial Naive Bayes | | Feature Clipping |
| Swish | Adam | Polynomial Regression | | |
| Unit Step | | Bernoulli Naive Bayes | | |
| | | Random Forest Classifier | | |
| | | K Means Clustering | | |
| | | Divisive Clustering | | |
| | | Agglomerative Clustering | | |
| | | Bayes Optimization | | |
| | | Numerical Outliers | | |
| | | Principal Component Analysis | | |
| | | Z_Score | | |
| | | Sequential Neural Network | | |
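Many of the building blocks above are only a few lines of NumPy each. As an illustration (a sketch only; the signatures in the repository's activations.py may differ), a handful of the listed activation functions can be written as:

```python
import numpy as np

# Illustrative NumPy versions of activations from the table above;
# names and signatures are assumptions, not the repository's exact API.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()
```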
| Loss Functions (`loss_func.py`) | Regularizers (`regularizer.py`) | Metrics (`metrics.py`) |
|---|---|---|
| Mean Squared Error | L1_Regularizer | Confusion Matrix |
| Logarithmic Error | L2_Regularizer | Precision |
| Absolute Error | | Accuracy |
| Cosine Similarity | | Recall |
| Log_cosh | | F1 Score |
| Huber | | F-B Theta |
| Mean Squared Log Error | | Specificity |
| Mean Absolute Percentage Error | | |
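In the same spirit, a loss function and a couple of the metrics from the table above can be sketched in pure NumPy (an illustration only; the repository's loss_func.py and metrics.py may use different names and signatures):

```python
import numpy as np

# Sketches of one loss and two metrics; these are illustrative
# assumptions, not the repository's exact implementations.
def mean_squared_error(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def precision(y_true, y_pred):
    tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
    fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
    return tp / (tp + fp)

def recall(y_true, y_pred):
    tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
    fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives
    return tp / (tp + fn)
```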