Robbepop / prophet

A simple neural net implementation.

Refactor layer sizes to be of equal sizes #4

Closed Robbepop closed 7 years ago

Robbepop commented 7 years ago

The concrete layers of the neural network were originally designed to be very memory efficient, which resulted in a design with asymmetric layer sizes (because of the way bias neurons are represented). This has led to several issues and implementation difficulties, preventing the use of more efficiently implemented lower-level algorithms provided by ndarray and other libraries. Refactoring the layer sizes also helps with implementing the new layer and topology layer types, paving the way for convolutional layers in the end.
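To illustrate the two layouts, here is a minimal dependency-free sketch (the type and field names are hypothetical, not the crate's actual API). In the fused-bias layout, the bias is stored as an extra weight column, so the input vector must carry a trailing constant 1.0 and is one element longer than the layer's neuron count; in the equal-size layout, the bias lives in its own vector and every activation vector has exactly as many elements as the layer has neurons:

```rust
// Fused-bias layout: weights have shape outputs x (inputs + 1); the last
// column multiplies a constant 1.0 appended to the input, making the
// input vector longer than the layer's neuron count (asymmetric sizes).
struct FusedBiasLayer {
    weights: Vec<Vec<f64>>,
}

impl FusedBiasLayer {
    fn forward(&self, input_with_bias: &[f64]) -> Vec<f64> {
        self.weights
            .iter()
            .map(|row| row.iter().zip(input_with_bias).map(|(w, x)| w * x).sum())
            .collect()
    }
}

// Equal-size layout: weights have shape outputs x inputs and the bias is
// a separate vector of length outputs, so layer sizes match neuron counts.
struct EqualSizeLayer {
    weights: Vec<Vec<f64>>,
    bias: Vec<f64>,
}

impl EqualSizeLayer {
    fn forward(&self, input: &[f64]) -> Vec<f64> {
        self.weights
            .iter()
            .zip(&self.bias)
            .map(|(row, b)| {
                row.iter().zip(input).map(|(w, x)| w * x).sum::<f64>() + b
            })
            .collect()
    }
}

fn main() {
    // One output neuron, two inputs, identical parameters in both layouts.
    let fused = FusedBiasLayer {
        weights: vec![vec![0.5, -0.25, 0.1]],
    };
    let equal = EqualSizeLayer {
        weights: vec![vec![0.5, -0.25]],
        bias: vec![0.1],
    };

    // The fused layout needs the trailing 1.0 bias input appended.
    let out_fused = fused.forward(&[1.0, 2.0, 1.0]);
    let out_equal = equal.forward(&[1.0, 2.0]);

    // Both layouts compute the same affine transformation.
    assert!((out_fused[0] - out_equal[0]).abs() < 1e-12);
    println!("fused = {:?}, equal = {:?}", out_fused, out_equal);
}
```

Because the equal-size layout keeps the weight matrix rectangular over the real inputs only, a matrix-vector product plus a vector addition maps directly onto the BLAS-style routines that ndarray exposes, which is harder with the fused column.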

Robbepop commented 7 years ago

Work on this was done some time ago. However, the implementation never became stable, so an entire redesign of the crate was initiated to provide better abstractions and maintainability.

The redesigned layer architecture implements this equal-layer-size behaviour, so this issue can be closed.