itdxer / neupy

NeuPy is a TensorFlow-based Python library for prototyping and building neural networks
http://neupy.com
MIT License

Bayesian Regularization #236

Closed. Ohayoosan closed this issue 5 years ago

Ohayoosan commented 5 years ago

Can you add Bayesian Regularization to neupy?

itdxer commented 5 years ago

Do you have references to some papers that apply this technique in the context of neural networks?

Ohayoosan commented 5 years ago

Here is a reference to Bayesian regularization: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0188553

itdxer commented 5 years ago

To be honest, I'm a bit skeptical. It looks like BR is quite slow (because of the inverse Hessian). It adds an improvement, but I'm not sure whether the same improvement wouldn't be observed with simple L2 regularisation. Also, I don't see any proof or the set of assumptions made for BR (they only give a brief overview of the algorithm). And in the modified cost function they use squared error (possibly because the LM algorithm requires it), but I'm not sure whether it will work with any other loss function.
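
For reference, the modified cost that BR papers typically minimise is

F(w) = beta * E_D + alpha * E_W

where E_D is the sum of squared errors, E_W is the sum of squared weights, and alpha, beta are coefficients that BR re-estimates from the (approximate) Hessian during training. With fixed alpha and beta this reduces to ordinary L2 weight decay, which is why the comparison matters.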

Do you know of a paper that compares BR to other regularisation methods like L2 or L1 regularisation? (I didn't find anything useful after a quick search.)

Ohayoosan commented 5 years ago

Unfortunately, I did not find the paper you mentioned. I am new to neural networks, and I got an assignment from my lecturer to build a BR neural network using Python as the main language. I struggled with adding the modified cost function to the Levenberg-Marquardt function in neupy. Sorry for bothering you.

itdxer commented 5 years ago

It's a bit tricky for the LM algorithm, since at each training step the algorithm expects the loss function to be MSE. It's possible to update the loss function, even though it's a bit hacky:

import tensorflow as tf
from neupy import algorithms
from neupy.layers import *

def modified_mse(actual, expected):
    # The loss function can be modified here; as written, this is plain MSE
    return tf.reduce_mean(tf.square(actual - expected))

class LMwithBR(algorithms.LevenbergMarquardt):
    loss = None

    def __init__(self, *args, **kwargs):
        # Override the default MSE loss with the modified one
        self.loss = modified_mse
        super(LMwithBR, self).__init__(*args, **kwargs)

optimizer = LMwithBR(Input(10) >> Sigmoid(5))
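
For example, a BR-style weight penalty could go into the modified loss. This is only a sketch, not a full BR implementation: alpha and beta are hypothetical fixed coefficients (BR proper re-estimates them from the Hessian during training), and tf.trainable_variables() is assumed to expose the network's weights.

import tensorflow as tf

def br_mse(actual, expected, alpha=0.01, beta=1.0):
    # E_D: mean squared prediction error
    error_term = tf.reduce_mean(tf.square(actual - expected))
    # E_W: sum of squared network weights, collected from the graph
    weight_term = tf.add_n([
        tf.reduce_sum(tf.square(w)) for w in tf.trainable_variables()
    ])
    # BR-style objective: weighted combination of data error and weight penalty
    return beta * error_term + alpha * weight_term
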
itdxer commented 5 years ago

I'm closing this issue for now, since I don't plan to add BR to the library any time soon. Feel free to open a new ticket if you have problems with the code that I've added in my previous comment.

wangq-ntu commented 4 years ago

It's a bit tricky for the LM algorithm, since at each training step the algorithm expects the loss function to be MSE. [...]

I am also trying to use LM with BR in PyTorch. Can your code be used in PyTorch?

itdxer commented 4 years ago

Unfortunately, NeuPy works only with TensorFlow, and the same code won't work in PyTorch.
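
The loss itself is easy to translate, though. Here is a sketch of the same BR-style objective in PyTorch (alpha and beta are again hypothetical fixed coefficients, and the parameters have to be passed in explicitly, e.g. from model.parameters()). Note that NeuPy's Levenberg-Marquardt optimizer has no drop-in PyTorch counterpart, so you would still need an LM implementation on the PyTorch side.

import torch

def br_mse(actual, expected, weights, alpha=0.01, beta=1.0):
    # E_D: mean squared prediction error
    error_term = torch.mean((actual - expected) ** 2)
    # E_W: sum of squared parameters (e.g. weights = list(model.parameters()))
    weight_term = sum(torch.sum(w ** 2) for w in weights)
    # BR-style objective: weighted combination of data error and weight penalty
    return beta * error_term + alpha * weight_term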

wangq-ntu commented 4 years ago

Unfortunately, NeuPy works only with TensorFlow, and the same code won't work in PyTorch.

OK, thanks.