jeffheaton / t81_558_deep_learning

T81-558: Keras - Applications of Deep Neural Networks @Washington University in St. Louis
https://sites.wustl.edu/jeffheaton/t81-558/

Getting KeyError, please help me out #90

Closed ramidim closed 3 years ago

ramidim commented 3 years ago

number_pics_per_bath = 3
steps = len(train_descriptions)//number_pics_per_bath

model_path = os.path.join(root_captioning,"data",f'caption-model.hdf5')
if not os.path.exists(model_path):
  for i in tqdm(range(EPOCHS*2)):
    generator = data_generator(train_descriptions, encoding_train,
                               wordtoidx, max_length, number_pics_per_bath)
    caption_model.fit(generator, epochs=1, steps_per_epoch=steps, verbose=1)

  caption_model.optimizer.lr = 1e-4
  number_pics_per_bath = 6
  steps = len(train_descriptions)//number_pics_per_bath

  for i in range(EPOCHS):
    generator = data_generator(train_descriptions, encoding_train,
                               wordtoidx, max_length, number_pics_per_bath)
    caption_model.fit(generator, epochs=1, steps_per_epoch=steps, verbose=1)

  caption_model.save_weights(model_path)
  print(f"\nTraining took: {hms_string(time()-start)}")
else:
  caption_model.load_weights(model_path)

  0%|          | 0/20 [00:00<?, ?it/s]

KeyError                                  Traceback (most recent call last)

<ipython-input> in <module>()
      5 for i in tqdm(range(EPOCHS*2)):
      6   generator = data_generator(train_descriptions, encoding_train, wordtoidx, max_length, number_pics_per_bath)
----> 7   caption_model.fit(generator, epochs=1, steps_per_epoch=steps, verbose=1)
      8 caption_model.optimizer.lr = 1e-4
      9 number_pics_per_bath = 6

4 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
   1062           use_multiprocessing=use_multiprocessing,
   1063           model=self,
-> 1064           steps_per_execution=self._steps_per_execution)
   1065
   1066     # Container that configures and calls `tf.keras.Callback`s.

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/data_adapter.py in __init__(self, x, y, sample_weight, batch_size, steps_per_epoch, initial_epoch, epochs, shuffle, class_weight, max_queue_size, workers, use_multiprocessing, model, steps_per_execution)
   1110         use_multiprocessing=use_multiprocessing,
   1111         distribution_strategy=ds_context.get_strategy(),
-> 1112         model=model)
   1113
   1114     strategy = ds_context.get_strategy()

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/data_adapter.py in __init__(self, x, y, sample_weights, workers, use_multiprocessing, max_queue_size, model, **kwargs)
    777     # Since we have to know the dtype of the python generator when we build the
    778     # dataset, we have to look at a batch to infer the structure.
--> 779     peek, x = self._peek_and_restore(x)
    780     peek = self._standardize_batch(peek)
    781     peek = _process_tensorlike(peek)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/data_adapter.py in _peek_and_restore(x)
    834   @staticmethod
    835   def _peek_and_restore(x):
--> 836     peek = next(x)
    837     return peek, itertools.chain([peek], x)

<ipython-input> in data_generator(descriptions, photos, wordtoidx, max_length, num_photos_per_batch)
      8   for key, desc_list in descriptions.items():
      9     n += 1
---> 10     photo = photos[key+'.jpg']
     11     # Each photo has 5 descriptions
     12     for desc in desc_list:

KeyError: '1000268201_693b08cb0e.jpg'
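The traceback shows the generator failing on `photos[key+'.jpg']`: the description key plus the `.jpg` suffix is not present in the encodings dict, which usually means the encodings were saved under a different key form (e.g. without the extension, or as full paths). A minimal diagnostic sketch, assuming the notebook's variable names and using hypothetical sample data, that tries both key forms and reports the mismatch instead of dying on a bare KeyError:

```python
# Hypothetical sample data mirroring the notebook's structures.
train_descriptions = {"1000268201_693b08cb0e": ["a child in a pink dress"]}
# Suppose the encodings were stored WITHOUT the '.jpg' suffix:
encoding_train = {"1000268201_693b08cb0e": [0.1, 0.2, 0.3]}

def lookup_photo(photos, key):
    """Try the key with and without '.jpg' so a suffix mismatch is obvious."""
    for candidate in (key + ".jpg", key):
        if candidate in photos:
            return photos[candidate]
    # Show a few actual keys to make the mismatch easy to diagnose.
    raise KeyError(f"{key!r} not found; sample keys: {list(photos)[:3]}")

for key in train_descriptions:
    vec = lookup_photo(encoding_train, key)  # succeeds on the bare key
```

This is only a debugging aid, not the notebook's code; once the key format of `encoding_train` is known, the real fix is to make the keys in `train_descriptions` and `encoding_train` consistent when they are built.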
jeffheaton commented 3 years ago

which notebook are you reporting an issue with? I do not understand the context of your question.

ramidim commented 3 years ago

I am using Google Colab. When I try to execute the above code, it throws this error.

jeffheaton commented 3 years ago

Are you trying to run one of my notebooks?

ramidim commented 3 years ago

Yes, I am trying to run Image caption generator

jeffheaton commented 3 years ago

I cannot reproduce the issue on my notebook. This is probably a better question for StackOverflow or similar.