jorgenkg / python-neural-network

This is an efficient implementation of a fully connected neural network in NumPy. The network can be trained by a variety of learning algorithms: backpropagation, resilient backpropagation and scaled conjugate gradient learning. The network has been developed with PYPY in mind.
BSD 2-Clause "Simplified" License

Getting "nan" error #28

Open NourO93 opened 5 years ago

NourO93 commented 5 years ago

I am trying to implement a simple example with scaled conjugate gradient. This is my code:

```python
# Imports as shown in the project README (assumed; they were not part of the original snippet)
from nimblenet.activation_functions import sigmoid_function
from nimblenet.cost_functions import cross_entropy_cost
from nimblenet.learning_algorithms import scaled_conjugate_gradient
from nimblenet.data_structures import Instance
from nimblenet.neuralnet import NeuralNet

dataset = [ Instance( [0,0], [0] ), Instance( [1,0], [1] ), Instance( [0,1], [1] ), Instance( [1,1], [0] ) ]
settings = {
    "n_inputs" : 2,
    "layers"   : [ (2, sigmoid_function), (1, sigmoid_function) ]
}

network        = NeuralNet( settings )
training_set   = dataset
test_set       = dataset
cost_function  = cross_entropy_cost

scaled_conjugate_gradient(
    # Required parameters
    network,                     # the neural network instance to train
    training_set,                # the training dataset
    test_set,                    # the test dataset
    cost_function,               # the cost function to optimize

    # Optional parameters
    ERROR_LIMIT          = 1e-3, # Error tolerance when terminating the learning
    max_iterations       = (),   # Regardless of the achieved error, terminate after max_iterations epochs. Default: infinite
    print_rate           = 1000, # The epoch interval to print progression statistics
    save_trained_network = False # Whether to ask the user if they would like to save the network after training
)
```

The output at each epoch looks something like this:

```
[training] Current error: nan Epoch: 1000
```

It never changes from nan, and I can't figure out why.
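
For reference, one common way cross-entropy goes to nan with sigmoid outputs is when an output saturates to exactly 0 or 1, so a log term becomes -inf and `0 * -inf` evaluates to nan, which then propagates through every subsequent update. The sketch below is plain NumPy, not code from this library, just to illustrate that failure mode and the usual clip-before-log guard:

```python
import numpy as np

target = np.array([0.0, 1.0])
output = np.array([0.0, 1.0])   # fully saturated sigmoid outputs

# Cross-entropy evaluated directly on saturated outputs produces nan
loss = -(target * np.log(output) + (1 - target) * np.log(1 - output))
print(loss)                      # [nan nan], with divide-by-zero warnings

# Clipping the outputs away from 0 and 1 before taking the log avoids it
eps = 1e-12
clipped = np.clip(output, eps, 1 - eps)
loss = -(target * np.log(clipped) + (1 - target) * np.log(1 - clipped))
print(loss)                      # small finite values
```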