dyigitpolat / mimarsinan

a pipeline for modeling in-memory architectures for spiking neural networks
MIT License

implement clamp adaptation #33

Closed dyigitpolat closed 1 year ago

dyigitpolat commented 1 year ago

Training with non-clamped activations has proven to be more effective. ClampedReLU should therefore be introduced in a separate adaptation step rather than during pre-training.

This also gives us the freedom to pre-train models with various activation functions. A rough sketch of the idea is shown below.
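
A minimal sketch of what such an adaptation step could look like, assuming a PyTorch-style model. The `ClampedReLU` module, the `replace_activations` helper, and the clamp value below are illustrative assumptions, not the repository's actual API.

```python
import torch
import torch.nn as nn

class ClampedReLU(nn.Module):
    """ReLU whose output is clamped to [0, clamp_max]."""
    def __init__(self, clamp_max: float = 1.0):
        super().__init__()
        self.clamp_max = clamp_max

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.clamp(x, min=0.0, max=self.clamp_max)

def replace_activations(model: nn.Module, clamp_max: float = 1.0) -> None:
    """Adaptation step: swap ReLU-like activations for ClampedReLU in place.

    Pre-training can use any activation; the clamp is only introduced here,
    before fine-tuning the adapted model.
    """
    for name, child in model.named_children():
        if isinstance(child, (nn.ReLU, nn.LeakyReLU, nn.GELU)):
            setattr(model, name, ClampedReLU(clamp_max))
        else:
            replace_activations(child, clamp_max)

# Usage: pre-train with ordinary activations, then adapt and fine-tune briefly.
model = nn.Sequential(nn.Linear(16, 32), nn.GELU(), nn.Linear(32, 10))
replace_activations(model, clamp_max=1.0)
```

The recursive swap keeps the pre-trained weights intact; only the activation modules change, after which a short fine-tuning run can recover any accuracy lost to clamping.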

dyigitpolat commented 1 year ago

done