Closed NenadZivic closed 8 years ago
I am getting the same error too when I try to use the embedding layer. My network looks like this:
```python
layers = [
    (InputLayer, {'shape': (None, NUM_FEATURES)}),
    (EmbeddingLayer, {'input_size': NUM_FEATURES, 'output_size': 750}),
    (DenseLayer, {'num_units': 750, 'nonlinearity': relu}),
    (DropoutLayer, {'p': 0.5}),
    (DenseLayer, {'num_units': 500, 'nonlinearity': relu}),
    (DropoutLayer, {'p': 0.5}),
    (DenseLayer, {'num_units': 250}),
    (DenseLayer, {'num_units': NUM_CLASSES, 'nonlinearity': softmax}),
]

net = NeuralNet(
    layers=layers,
    max_epochs=5,
    update=nesterov_momentum,
    update_learning_rate=0.001,
    update_momentum=0.9,
    train_split=TrainSplit(eval_size=0.25),
    verbose=1,
)
```
Did you figure out what was causing this error?
It is definitely possible, though I don't have working code here right now. My first attempt would be to set the `input_var` parameter of the `InputLayer` to an integer type (for instance `T.imatrix`).
@NenadZivic Why is your input 4D?
`InputLayer`'s `input_var` is by default set to:

```python
input_var_type = T.TensorType(theano.config.floatX, [False] * ndim)
```
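The float default matters because an embedding layer looks up rows of its weight matrix using the input values as indices, and index arrays must be integers. A minimal NumPy analogy (NumPy rather than Theano, purely to illustrate the indexing rule):

```python
import numpy as np

# A toy embedding table: 5 "words", each mapped to a 3-dimensional vector.
table = np.arange(15, dtype=np.float32).reshape(5, 3)

int_ids = np.array([0, 2, 4])          # integer indices: lookup works
print(table[int_ids].shape)            # (3, 3)

float_ids = np.array([0.0, 2.0, 4.0])  # float indices: rejected
try:
    table[float_ids]
except IndexError as e:
    print("IndexError:", e)
```

Theano enforces the same rule symbolically, which is why a `floatX` input variable feeding an embedding lookup fails at graph-construction time.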
Explicitly passing an input var should fix your issue. E.g.:

```python
InputLayer((None, NUM_FEATURES), input_var=T.imatrix())
```
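In the nolearn layers-as-tuples style used in the network above, the same fix can be expressed by adding `input_var` to the `InputLayer` kwargs. This is a configuration sketch, not tested here; the variable name `'ids'` is illustrative:

```python
import theano.tensor as T

layers = [
    # Declare the input as an integer matrix so EmbeddingLayer
    # receives word ids rather than default floatX values.
    (InputLayer, {'shape': (None, NUM_FEATURES),
                  'input_var': T.imatrix('ids')}),
    (EmbeddingLayer, {'input_size': NUM_FEATURES, 'output_size': 750}),
    # ... remaining layers unchanged ...
]
```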
Yup, this fixed the issue. Thank you @dnouri
That was the problem, thanks @dnouri.
Hi, I'm trying to add an EmbeddingLayer to my network (a simple MLP net). Here is the complete code, as it is not very long:
And I get this error:
Has anyone had the same error, or am I doing something wrong? If so, could you please provide a simple example of how it should be done? I didn't manage to find any examples online.
Thanks, Nenad