Closed: KushalDave closed this issue 5 years ago
@lvapeab Here is the stack trace, if I uncomment the line:
```
Traceback (most recent call last):
  File "NMTKeras\TrainGen_Attention.py", line 351, in
```
Hi @KushalDave ,
It seems that with `multi_gpu_model`, Keras generates two Savers for TensorBoard. I've just uploaded a (dirty) fix for that (https://github.com/MarcBS/keras/commit/382e69c6eb732cb13b69bd7c408fdfc4248c67fc, https://github.com/lvapeab/multimodal_keras_wrapper/commit/14547a179f0ff4a99285f4807c1ae13d23c88b5c). I've tested it and it seems to work properly. You should update the Keras and Multimodal Keras Wrapper repositories.
Cheers.
Hi again,
We've decided to temporarily disable the embeddings data options in the TensorBoard callback and rely on the one from the original Keras repo. I'm therefore closing this issue.
Cheers.
I put up a model that builds on the TranslationModel you have written. The code below stops working if I uncomment the following line:
callbacks.append(callback_tensorboard)
I looked at this: https://github.com/keras-team/keras/issues/6988, where a commenter notes: "I've come across this issue and it was occurring because my combined models were cascaded, thus the Tensorboard callback has a hard time finding given layers (you might be able to give it a proper input like "model2/layer3" but my attempts at that have failed, I'm quite new at this)." Do you have a solution for this?
```python
if __name__ == '__main__':
    import string
    import re
    import numpy as np
    import tensorflow as tf
    import cloudpickle as cloudpk
    import os
    from DataGeneratorAttention import DataGeneratorAttention
    from DataGenerator import DataGenerator
    from keras.callbacks import ModelCheckpoint
    from keras.utils import Sequence
    from keras.models import Sequential
    from keras.layers import LSTM
    from keras.layers import Dense
    from keras.layers import Embedding
    from keras.layers import RepeatVector
    from keras.layers import TimeDistributed
    from custom_recurrents import AttentionDecoder
```