GiorgosXou / NeuralNetworks

A resource-conscious neural network implementation for MCUs
MIT License

Add support for convolutional/kernel/filter layers #33

Open · GiorgosXou opened this issue 6 months ago

GiorgosXou commented 6 months ago

My thought process:

I could efficiently add support for convolutional/kernel/filter layers, while preserving overall performance, with just a fake activation function and a simple condition when loading weights (i.e. an "activation function" defined as FILTER...). The filter size could potentially be squeezed into the same single byte that each activation function already occupies: this library supports 14 built-in activation functions plus 5 custom ones, and a byte allows values up to 255, so an "activation function" named FILTERX with the value 22 could stand for a 2x2 filter, 98 for 9x8, 101 for 10x1, and so on (or, even better, 22 could be shifted to represent 1x2, perhaps by a predefined shifting offset). Who is going to need more than a single byte can encode on an MCU, let's be honest... although I could also add such a preference. Finally, the output would be manipulated through that filter/fake activation function, plus some additional destructor logic.
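For illustration only, here is a minimal C++ sketch of the digit-packing idea under stated assumptions; none of these names (`LAST_ACTIVATION_ID`, `FilterSize`, `decodeFilter`) exist in the library, and the actual range of activation IDs is a guess:

```cpp
// Hypothetical sketch: reuse the single "activation function" byte to also
// encode a filter size. Assumes (not verified) that the real activation IDs
// occupy 0..18 (14 built-in + 5 custom), leaving every higher value free.
#include <cstdint>
#include <cstdio>

const uint8_t LAST_ACTIVATION_ID = 18; // assumption: 14 built-in + 5 custom, zero-indexed

struct FilterSize {
    uint8_t width;
    uint8_t height;
};

// Decode "22 -> 2x2", "98 -> 9x8", "101 -> 10x1" as described above.
// Returns false when the byte is a plain activation-function ID.
bool decodeFilter(uint8_t activation_id, FilterSize &out) {
    if (activation_id <= LAST_ACTIVATION_ID) return false;
    out.width  = activation_id / 10; // "tens" part of the value
    out.height = activation_id % 10; // "ones" part of the value
    return out.width > 0 && out.height > 0;
}

int main() {
    const uint8_t ids[] = {7, 22, 98, 101};
    for (uint8_t id : ids) {
        FilterSize f;
        if (decodeFilter(id, f)) {
            std::printf("id %3u -> %ux%u filter\n",
                        (unsigned)id, (unsigned)f.width, (unsigned)f.height);
        } else {
            std::printf("id %3u -> plain activation function\n", (unsigned)id);
        }
    }
    return 0;
}
```

The shifted variant mentioned above (e.g. 22 meaning 1x2) would just subtract a predefined offset before splitting the digits.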

Outro: It always sounds exciting and funny until I start working on it and realise it isn't as easy as I first thought lol

GiorgosXou commented 6 months ago

Life could be a dream until reality checks

GiorgosXou commented 6 months ago

SOUNDS TOO GOOD TO BE TRUE

GiorgosXou commented 4 months ago

MONERO: 87PVyQ8Vt768hnvVyR9Qw1NyGzDea4q9Zd6AuwHb8tQBU9VdRYjRoBL7Ya8yRPVQakW2pjt2UWEtzYoxiRd7xpuB4XSJVAW