RSYashwanth/goNN


New Structure Proposal #1

Closed: 1611Dhruv closed this 2 days ago

1611Dhruv commented 4 days ago
neuralnet/
├── cmd/
│   └── example/          # Example applications using the library
│       └── main.go       # Example code showing how to use the library
├── internal/             # Internal utilities not exposed to the user
│   ├── data/             # Data loading, preprocessing, and augmentation utilities
│   ├── math/             # Specialized math functions or utilities
│   └── logger.go         # Logging utility for debugging and monitoring
├── pkg/                  # Core library packages
│   ├── activations/      # Activation functions (e.g., ReLU, Sigmoid, Tanh)
│   │   └── activations.go
│   ├── layers/           # Neural network layers (e.g., Dense, Conv, RNN)
│   │   └── dense.go
│   ├── loss/             # Loss functions (e.g., MSE, Cross-Entropy)
│   │   └── loss.go
│   ├── metrics/          # Metrics for model evaluation (e.g., accuracy, F1-score)
│   │   └── metrics.go
│   ├── models/           # Model construction and management (e.g., Sequential, Functional API)
│   │   └── sequential.go
│   ├── optimizers/       # Optimizers (e.g., SGD, Adam, RMSProp)
│   │   └── sgd.go
│   ├── utils/            # General utilities (e.g., tensor manipulation, random seeds)
│   │   └── tensors.go
│   └── training/         # Training loop, validation, checkpointing
│       └── training.go
├── tests/                # Unit tests and integration tests
│   ├── activations_test.go
│   ├── layers_test.go
│   └── training_test.go
├── examples/             # Standalone examples or demos
│   ├── mnist/            # MNIST example
│   │   ├── main.go
│   │   └── utils.go
│   └── iris/             # Iris dataset example
│       └── main.go
├── go.mod                # Go module file
├── go.sum                # Dependency checksum file
└── README.md             # Project documentation
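
For a feel of how this layout would be consumed, a hypothetical cmd/example/main.go might look like the following. The module path and every package API below are placeholders invented to illustrate the import structure, not part of the proposal:

// Hypothetical cmd/example/main.go; the module path and all
// package APIs below are placeholders for illustration only.
package main

import (
	"fmt"

	"github.com/RSYashwanth/goNN/pkg/layers"
	"github.com/RSYashwanth/goNN/pkg/models"
	"github.com/RSYashwanth/goNN/pkg/optimizers"
)

func main() {
	// Assemble a tiny model from the proposed core packages.
	model := models.NewSequential(
		layers.NewDense(4, 8),
		layers.NewDense(8, 3),
	)
	opt := optimizers.NewSGD(0.01)

	fmt.Println("model:", model, "optimizer:", opt)
}
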
1611Dhruv commented 4 days ago

Useful math functions and libraries:

ChatGPT TL;DR

1. Linear Algebra Operations
2. Tensor Operations
3. Activation Functions
4. Loss Functions
5. Optimizer Support Functions
6. Probability & Statistics
7. Utility Functions
8. Advanced Functions (Optional)

How to Organize These Functions

Should we pick a few of the most useful ones to start off with? A sketch of what that could look like follows.
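
For instance, a minimal sketch of a few helpers from categories 3 (activations) and 4 (losses). The package name mathutil and every signature here are placeholders; note that naming the package math, to match internal/math in the tree above, would force consumers that also use the standard library math to alias one of the imports:

// Hypothetical starter math helpers; names and signatures are
// placeholders, not a settled API.
package mathutil

import "math"

// Sigmoid squashes x into (0, 1): 1 / (1 + e^-x).
func Sigmoid(x float64) float64 {
	return 1.0 / (1.0 + math.Exp(-x))
}

// SigmoidPrime is Sigmoid's derivative, needed for backprop.
func SigmoidPrime(x float64) float64 {
	s := Sigmoid(x)
	return s * (1.0 - s)
}

// ReLU returns max(0, x).
func ReLU(x float64) float64 {
	return math.Max(0, x)
}

// MSE is the mean squared error between predictions and targets;
// it assumes both slices have the same nonzero length.
func MSE(pred, target []float64) float64 {
	var sum float64
	for i := range pred {
		d := pred[i] - target[i]
		sum += d * d
	}
	return sum / float64(len(pred))
}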

1611Dhruv commented 4 days ago

Useful video: https://www.youtube.com/watch?v=pauPCy_s0Ok&ab_channel=TheIndependentCode

Here is my suggestion for the structure:

NN UML

classDiagram
    class Tensor {
        +float[] data
        +int[] shape
        +apply(func: Function): Tensor
    }

    class Layer {
        <<abstract>>
        +forward(input: Tensor): Tensor
        +backward(grad: Tensor, optimizer: Optimizer): Tensor
    }

    class DenseLayer {
        -Tensor weights
        -Tensor biases
        +forward(input: Tensor): Tensor
        +backward(grad: Tensor, optimizer: Optimizer): Tensor
    }

    class ReLU {
        +forward(input: Tensor): Tensor
        +backward(grad: Tensor, optimizer: Optimizer): Tensor
    }

    class Conv2D {
        -Tensor[] kernels
        +forward(input: Tensor): Tensor
        +backward(grad: Tensor, optimizer: Optimizer): Tensor
    }

    class MaxPool {
        +forward(input: Tensor): Tensor
        +backward(grad: Tensor, optimizer: Optimizer): Tensor
    }

    class Optimizer {
        <<interface>>
        +step(params: Tensor, grads: Tensor): void
    }

    class SGD {
        -float learningRate
        +step(params: Tensor, grads: Tensor): void
    }

    Layer <|-- DenseLayer
    Layer <|-- ReLU
    Layer <|-- Conv2D
    Layer <|-- MaxPool
    Optimizer <|-- SGD
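
For reference, here is one way the diagram could translate into Go. Go has no abstract classes, so Layer becomes an interface; everything below is a sketch for discussion, not a final API:

// Sketch translating the UML above into Go; all names and
// signatures are assumptions for discussion.
package nn

// Tensor holds a flat data buffer plus a shape, as in the diagram.
type Tensor struct {
	Data  []float64
	Shape []int
}

// Apply returns a new Tensor with fn applied elementwise.
func (t Tensor) Apply(fn func(float64) float64) Tensor {
	out := Tensor{Data: make([]float64, len(t.Data)), Shape: t.Shape}
	for i, v := range t.Data {
		out.Data[i] = fn(v)
	}
	return out
}

// Optimizer mirrors the Optimizer interface in the diagram.
type Optimizer interface {
	Step(params, grads Tensor)
}

// Layer stands in for the abstract Layer class; DenseLayer, ReLU,
// Conv2D, and MaxPool would each implement it.
type Layer interface {
	Forward(input Tensor) Tensor
	Backward(grad Tensor, opt Optimizer) Tensor
}

// SGD is plain stochastic gradient descent.
type SGD struct {
	LearningRate float64
}

// Step updates params in place: p -= lr * g. The slice inside
// params shares its backing array with the caller's Tensor, so
// the update is visible outside this method.
func (s SGD) Step(params, grads Tensor) {
	for i := range params.Data {
		params.Data[i] -= s.LearningRate * grads.Data[i]
	}
}

One open design question: the diagram passes the Optimizer into backward, so each layer updates its own parameters during the backward pass; the alternative is for backward to only return gradients and have the training loop call step.
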
1611Dhruv commented 4 days ago

Also, you can create all the package directories in one command:

mkdir -p neuralnet/pkg/{activations,layers,loss,metrics,models,optimizers,utils,training}
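
After creating the directories, initializing the module would presumably look like this (the module path is a guess based on the repo name):

cd neuralnet
go mod init github.com/RSYashwanth/goNN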