GiorgosXou / NeuralNetworks

A resource-conscious neural network implementation for MCUs
MIT License

Add support for convolutional\kernel\filter layers #33

Open GiorgosXou opened 1 month ago

GiorgosXou commented 1 month ago

My thought process:

I could efficiently add support for convolutional\kernel\filter layers, while preserving overall performance, using just a fake activation function and a simple condition when loading weights (i.e. an "activation function" defined as FILTER...). Potentially by squeezing the filter-size logic into the leftover range of the single byte used for each activation function. For example, considering this library supports 14 activation functions plus 5 custom ones, and a byte allows values up to 255, an activation function named FILTERX with value 22 could mean a 2x2 filter, 98 a 9x8, 101 a 10x1, and so on (or, even better, I could shift things so 22 represents 1x2 [or even shift by a predefined variable]). Let's be honest, who's going to use more than 255 on an MCU... although I could also add such a preference too. And finally, the output would be manipulated via that filter\fake-activation function, plus some additional destructor logic.
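Just to sanity-check the byte-packing part, a minimal sketch (the names `FILTER_OFFSET`, `encodeFilter`, `isFilter` and `decodeFilter` are made up for illustration, nothing in the library yet; it assumes the 14+5 existing activation functions take the values 0..18):

```cpp
#include <stdint.h>

// Assumed: 14 built-in + 5 custom activation functions occupy values 0..18,
// so any activation byte at or above this is treated as a filter.
const uint8_t FILTER_OFFSET = 19;

// Pack a filter's width and height as decimal digits: 2x2 -> 22, 9x8 -> 98,
// 10x1 -> 101. width*10 + height must still fit in a byte (<= 255).
uint8_t encodeFilter(uint8_t width, uint8_t height) {
    return (uint8_t)(width * 10 + height);
}

// The "simple condition when loading weights": filter or normal activation?
bool isFilter(uint8_t activation) {
    return activation >= FILTER_OFFSET;
}

// Recover the filter dimensions from the packed byte.
void decodeFilter(uint8_t activation, uint8_t &width, uint8_t &height) {
    width  = activation / 10;
    height = activation % 10;
}
```

With this plain digit scheme, tiny filters like 1x2 (= 12) would collide with the normal activation values, which is exactly where the shift-by-a-predefined-variable idea would come in.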

Outro: It always sounds exciting and fun until I start working on it and realise it isn't as easy as I first thought lol

GiorgosXou commented 1 month ago

Life could be a dream until reality checks

GiorgosXou commented 1 month ago

SOUNDS TOO GOOD TO BE TRUE