Closed · asimkievich closed this 9 years ago
Make sure that you've got the right versions of Lasagne and nolearn installed. These have changed since the first version of the tutorial.
```shell
pip install -U -r https://raw.githubusercontent.com/dnouri/kfkd-tutorial/master/requirements.txt
```
Alternatively, and maybe better if you're using Conda:
```shell
pip uninstall Lasagne
pip uninstall nolearn
pip install -r https://raw.githubusercontent.com/dnouri/kfkd-tutorial/master/requirements.txt
```
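A quick sanity check after reinstalling (my own sketch, not part of the tutorial): confirm which Lasagne and nolearn builds the current interpreter actually imports, since a stale egg on the path can shadow a fresh pip install.

```python
# Print the Lasagne/nolearn versions visible to this Python, or note
# that a package is missing. Purely diagnostic; safe to run anywhere.
import importlib

found = {}
for name in ("lasagne", "nolearn"):
    try:
        mod = importlib.import_module(name)
        found[name] = getattr(mod, "__version__", "unknown")
    except ImportError:
        found[name] = "not installed"

for name, version in found.items():
    print(name, version)
```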
Related: #13
Hi there, I get the error below and was wondering if you could help. By the way, awesome tutorial on convolutional NNs. Thanks!
```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

from lasagne.layers import DenseLayer
from lasagne.layers import InputLayer
from lasagne.nonlinearities import tanh
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet

def load_train_data(path):
    df = pd.read_csv(path)
    X = df.ix[0:19, 0:4]
    y = df.ix[0:19, 4]
    X = np.log(4 + X)
    scaler = StandardScaler()
    X = scaler.fit_transform(X)
    return X, y, scaler

X, y, scaler = load_train_data('16_S_nn_2.csv')
num_features = X.shape[1]

layers0 = [('input', InputLayer),
           ('dense0', DenseLayer)]

net0 = NeuralNet(layers=layers0,

net0.fit(X, y)
```
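Aside from the version mismatch addressed in the answer above, one thing worth checking in code like this: nolearn's `NeuralNet` works on plain NumPy arrays, while `df.ix` returns pandas objects. A minimal, hypothetical preparation step (`to_net_arrays` is my own name, not from the post) casts both to `float32`, matching Theano's default `floatX`, and flattens the target to 1-D:

```python
# Hypothetical helper, not from the original post: convert pandas-style
# inputs to the NumPy float32 arrays that Theano-backed code expects.
import numpy as np

def to_net_arrays(X, y):
    X = np.asarray(X, dtype=np.float32)
    y = np.asarray(y, dtype=np.float32).ravel()  # flatten to a 1-D target vector
    return X, y

Xa, ya = to_net_arrays([[1.0, 2.0], [3.0, 4.0]], [[0.5], [1.5]])
print(Xa.dtype, ya.shape)
```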
And the complete traceback is:
```
C:\Users........\Anaconda\lib\site-packages\lasagne-0.1dev-py2.7.egg\lasagne\init.py:30: UserWarning: The uniform initializer no longer uses Glorot et al.'s approach to determine the bounds, but defaults to the range (-0.01, 0.01) instead. Please use the new GlorotUniform initializer to get the old behavior. GlorotUniform is now the default for all layers.
  warnings.warn("The uniform initializer no longer uses Glorot et al.'s "
DenseLayer (None, 200) produces 200 outputs
InputLayer (None, 4L) produces 4 outputs
C:\Users.....\Anaconda\lib\site-packages\lasagne-0.1dev-py2.7.egg\lasagne\layers\helper.py:52: UserWarning: get_all_layers() has been changed to return layers in topological order. The former implementation is still available as get_all_layers_old(), but will be removed before the first release of Lasagne. To ignore this warning, use warnings.filterwarnings('ignore', '.*topo.*').
  warnings.warn("get_all_layers() has been changed to return layers in "
Traceback (most recent call last):
  File "C:\Users.......\Anaconda\lib\site-packages\IPython\core\interactiveshell.py", line 2883, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "", line 1, in
    net0.fit(X, y)
  File "C:\Users\aleja_000\Anaconda\lib\site-packages\nolearn\lasagne.py", line 145, in fit
    self.y_tensor_type,
  File "C:\Users\aleja_000\Anaconda\lib\site-packages\nolearn\lasagne.py", line 295, in _create_iter_funcs
    updates = update(loss_train, all_params, **update_params)
  File "C:\Users.....\Anaconda\lib\site-packages\lasagne-0.1dev-py2.7.egg\lasagne\updates.py", line 260, in nesterov_momentum
    updates = sgd(loss_or_grads, params, learning_rate)
  File "C:.....\aleja_000\Anaconda\lib\site-packages\lasagne-0.1dev-py2.7.egg\lasagne\updates.py", line 74, in sgd
    grads = get_or_compute_grads(loss_or_grads, params)
  File "C:\Users.......\Anaconda\lib\site-packages\lasagne-0.1dev-py2.7.egg\lasagne\updates.py", line 51, in get_or_compute_grads
    return theano.grad(loss_or_grads, params)
  File "C:\Users.........\Anaconda\lib\site-packages\theano\gradient.py", line 432, in grad
    raise TypeError("cost must be a scalar.")
TypeError: cost must be a scalar.
```
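For what it's worth, the final `TypeError` says `theano.grad` was handed a non-scalar cost: gradient-based updates differentiate a single scalar objective, so the per-sample losses must be reduced to one number first. A plain-NumPy illustration of the distinction (no Theano required):

```python
# Illustration only: an optimizer differentiates one scalar objective.
# A per-sample loss is a vector; reducing it (e.g. with .mean()) yields
# the kind of 0-dimensional scalar that theano.grad insists on.
import numpy as np

pred = np.array([0.2, 0.9, 0.4])
target = np.array([0.0, 1.0, 0.0])

per_sample = (pred - target) ** 2  # one loss value per sample, shape (3,)
cost = per_sample.mean()           # scalar objective, 0-dimensional

print(per_sample.shape, np.ndim(cost))
```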