udibr / headlines

Automatically generate headlines for short articles
MIT License

Prediction is not working #25

Open BenjamWhite opened 7 years ago

BenjamWhite commented 7 years ago

I'm running the same code on test data and get strange weights back.

import h5py
import numpy as np

with h5py.File('data/%s.hdf5' % FN1, mode='r') as f:
    # Keras 2 nests the layer groups under 'model_weights'
    if 'layer_names' not in f.attrs and 'model_weights' in f:
        f = f['model_weights']
    weights = [np.copy(v) for v in f['timedistributed_1'].itervalues()]

and map(lambda x: x.shape, weights) gives me back: [(2,)]

I also ran the code with Keras 2.0.0 and with the latest version. Could this be due to different versions?
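For pinning down version differences, a quick sketch that prints the exact versions in each environment:

import keras
import h5py

# Weight-file layout changed between Keras 1 and 2, so knowing the
# exact versions in both environments narrows things down.
print('%s %s' % (keras.__version__, h5py.__version__))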

Thanks in advance!

imranshaikmuma commented 7 years ago

@BenjamWhite I have been trying to fix this problem for two days. Please help me if you find a solution too.

BenjamWhite commented 7 years ago

At first I thought it might be something with Theano and TensorFlow, as they store data in a different order, but changing channels_first to channels_last didn't help.
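For what it's worth, the active backend and data ordering can be checked at runtime, a minimal sketch assuming Keras 2:

from keras import backend as K

# Print which backend is in use and which channel ordering is active
print('%s %s' % (K.backend(), K.image_data_format()))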

imranshaikmuma commented 7 years ago

What I think is that the file does not contain the weights of the 'time_distributed_1' layer. I googled and found nothing on this. It only has [array(['bias:0', 'kernel:0'], dtype='<U8')] inside.
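A minimal sketch (same FN1 and layout assumptions as the snippet above) that separates those name strings from the actual datasets, on the guess that they come from the group's weight_names attribute, which Keras 2 stores alongside the real arrays:

import h5py

# Sketch, assuming the Keras 2 save_weights layout: each layer group
# carries a 'weight_names' attribute (strings) alongside the datasets.
with h5py.File('data/%s.hdf5' % FN1, mode='r') as f:
    g = f['time_distributed_1']
    print(g.attrs.get('weight_names'))  # the name strings you are seeing
    print(list(g.keys()))               # where the numeric arrays live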

BenjamWhite commented 7 years ago

Well, that's a different issue: when TimeDistributed gets defined, there is a ) missing before name, so the name is assigned to the Dense layer instead of the TimeDistributed wrapper. Before:

model.add(TimeDistributed(Dense(vocab_size,
                                kernel_regularizer=regularizer, 
                                bias_regularizer=regularizer, name = 'timedistributed_1')))

After:

model.add(TimeDistributed(Dense(vocab_size,
                                kernel_regularizer=regularizer,
                                bias_regularizer=regularizer),
                          name='timedistributed_1'))
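A quick way to check the fix, sketched with made-up layer sizes: with the corrected parenthesis, the name lands on the TimeDistributed wrapper, which is also the group name used in the hdf5 file.

from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed

# Toy model (hypothetical sizes): the name should now appear on the
# wrapper itself in the layer list, not on the inner Dense.
model = Sequential()
model.add(LSTM(16, return_sequences=True, input_shape=(10, 8)))
model.add(TimeDistributed(Dense(100), name='timedistributed_1'))
print([layer.name for layer in model.layers])  # [..., 'timedistributed_1']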

There is also a way to inspect the contents of the hdf5 file with h5py.
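For reference, a minimal sketch of such an inspection (same FN1 as above):

import h5py

# Recursively print every group/dataset path in the weights file,
# plus the shape where the object is a dataset.
def dump(name, obj):
    print('%s %s' % (name, getattr(obj, 'shape', '')))

with h5py.File('data/%s.hdf5' % FN1, mode='r') as f:
    f.visititems(dump)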

Haha, that's the next error I'm getting. With those values [array(['bias:0', 'kernel:0'], dtype='<U8')], the

# our very own softmax
def output2probs(output):
    print 'output:', type(output), output
    print 'weights[0]:', type(weights[0]), weights[0]
    #print 'weights[1]:', type(weights[1])
    output = np.dot(output, weights[0]) + weights[1]  # kernel, then bias
    output -= output.max()  # shift by max for numerical stability
    output = np.exp(output)
    output /= output.sum()
    return output

fails, saying it doesn't know dtype='<U8' (those entries are the weight name strings, so np.dot has nothing numeric to work with).

imranshaikmuma commented 7 years ago

weights[0] and weights[1] should each be an array of actual weights. We are not getting that from our trained model file. If you can solve this, everything else will be solved automatically.
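A quick sanity check along those lines: print the dtype and shape of each entry; any string dtype means the names were loaded instead of the arrays.

# Expect two float arrays, e.g. kernel (hidden_dim, vocab_size)
# and bias (vocab_size,); a dtype like '<U8' means names, not weights.
for w in weights:
    print('%s %s' % (w.dtype, w.shape))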

imranshaikmuma commented 7 years ago

Check this link, it helped me a lot (Cell 94): https://github.com/rtlee9/recipe-summarization/blob/master/src/predict.ipynb. Here, when we use fit_generator, weights are produced only up to dropout3, and there is no layer for TimeDistributed. Everyone has the same problem, but they ignored it and moved on.
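One way to verify which layers actually made it into the file is the top-level layer_names attribute that the loading snippet above already checks for:

import h5py

# List the layer names recorded in the weights file; if the list
# stops at dropout3, the TimeDistributed weights were never saved.
with h5py.File('data/%s.hdf5' % FN1, mode='r') as f:
    print(list(f.attrs['layer_names']))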

BenjamWhite commented 7 years ago

Were you able to run this?

imranshaikmuma commented 7 years ago

I am able to run it, but I am not getting the desired output. I am using different data: job descriptions, from which I generate job titles. My data is huge and I am looking for a GPU to train the model. What about you?

BenjamWhite commented 7 years ago

I haven't really worked on it any more (no time so far), so it's basically still the same error. How did you get around that error before?

What is the difference between the expected and the actual output? Is it a data problem?

elainelinlin commented 6 years ago

If you open the hdf5 file you should see the file structure. For me, I had to modify the loading code to

import h5py
import numpy as np

with h5py.File('data/%s.hdf5' % FN1, mode='r') as f:
    if 'layer_names' not in f.attrs and 'model_weights' in f:
        f = f['model_weights']
    # Keras 2 nests the weights one level deeper; .values() yields them
    # alphabetically, so bias:0 comes first and kernel:0 second
    weights = [np.copy(v) for v in f['time_distributed_1']['time_distributed_1'].values()]
# swap so that weights[0] is the kernel and weights[1] the bias,
# as output2probs expects (a plain list avoids ragged-array issues)
weights = [weights[1], weights[0]]
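With real arrays in weights, a quick smoke test of output2probs (random input, shapes inferred from the kernel) should now return a valid probability distribution:

import numpy as np

# Hypothetical smoke test: push a random decoder output vector through
# output2probs and check the result sums to ~1 over the vocabulary.
rnn_out = np.random.randn(weights[0].shape[0])
probs = output2probs(rnn_out)
print('%s %.4f' % (probs.shape, probs.sum()))
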
nauman-akram commented 6 years ago

@elainelinlin did you get any prediction out of it? For your own data?

elainelinlin commented 6 years ago

@fzr2009 Yes. What is not working for you?

nauman-akram commented 6 years ago

@elainelinlin I get this KeyError every time, with a different string in it each time. Please help if you can. [screenshot of the KeyError traceback]