tests/common/test_case.py:76: in ensure_model_trains_and_loads
model.train()
deep_qa/training/trainer.py:357: in train
self._save_auxiliary_files()
deep_qa/training/text_trainer.py:223: in _save_auxiliary_files
super(TextTrainer, self)._save_auxiliary_files()
deep_qa/training/trainer.py:619: in _save_auxiliary_files
model_config = self.model.to_json()
../../anaconda/lib/python3.5/site-packages/keras/engine/topology.py:2546: in to_json
model_config = self._updated_config()
../../anaconda/lib/python3.5/site-packages/keras/engine/topology.py:2513: in _updated_config
config = self.get_config()
config = self.get_config()
self = <deep_qa.training.models.DeepQaModel object at 0x129c48a58>
def get_config(self):
    config = {
        'name': self.name,
    }
    node_conversion_map = {}
    for layer in self.layers:
        if issubclass(layer.__class__, Container):
            # Containers start with a pre-existing node
            # linking their input to output.
            kept_nodes = 1
        else:
            kept_nodes = 0
        for original_node_index, node in enumerate(layer.inbound_nodes):
            node_key = layer.name + '_ib-' + str(original_node_index)
            if node_key in self.container_nodes:
                node_conversion_map[node_key] = kept_nodes
                kept_nodes += 1
    layer_configs = []
    for layer in self.layers:  # From the earliest layers on.
        layer_class_name = layer.__class__.__name__
        layer_config = layer.get_config()
        filtered_inbound_nodes = []
        for original_node_index, node in enumerate(layer.inbound_nodes):
            node_key = layer.name + '_ib-' + str(original_node_index)
            if node_key in self.container_nodes:
                # The node is relevant to the model:
                # add to filtered_inbound_nodes.
                if node.arguments:
                    try:
                        json.dumps(node.arguments)
                        kwargs = node.arguments
                    except TypeError:
                        warnings.warn(
                            'Layer ' + layer.name +
                            ' was passed non-serializable keyword arguments: ' +
                            str(node.arguments) + '. They will not be included '
                            'in the serialized model (and thus will be missing '
                            'at deserialization time).')
                        kwargs = {}
                else:
                    kwargs = {}
                if node.inbound_layers:
                    node_data = []
                    for i in range(len(node.inbound_layers)):
                        inbound_layer = node.inbound_layers[i]
                        node_index = node.node_indices[i]
                        tensor_index = node.tensor_indices[i]
                        node_key = inbound_layer.name + '_ib-' + str(node_index)
                        new_node_index = node_conversion_map.get(node_key, 0)
                        node_data.append([inbound_layer.name,
                                          new_node_index,
                                          tensor_index,
                                          kwargs])
                    filtered_inbound_nodes.append(node_data)
        layer_configs.append({
            'name': layer.name,
            'class_name': layer_class_name,
            'config': layer_config,
            'inbound_nodes': filtered_inbound_nodes,
        })
    config['layers'] = layer_configs
    # Gather info about inputs and outputs.
    model_inputs = []
    for i in range(len(self.input_layers)):
        layer = self.input_layers[i]
        node_index = self.input_layers_node_indices[i]
        node_key = layer.name + '_ib-' + str(node_index)
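For context, the node bookkeeping in the pasted `get_config` boils down to: each inbound node is identified by a string key, `layer.name + '_ib-' + index`, and original node indices are remapped to the indices of the nodes the model actually keeps. A standalone sketch of that remapping (the function and example names here are illustrative, not the real Keras objects):

```python
# Standalone sketch of the node-key remapping done in get_config above.
# "container_nodes" is the set of node keys that belong to the model graph;
# nodes outside it are dropped, and surviving nodes are renumbered from 0
# (or from 1 for Containers, which keep a pre-existing input-to-output node).

def build_node_conversion_map(layer_name, num_nodes, container_nodes,
                              is_container=False):
    kept_nodes = 1 if is_container else 0
    conversion = {}
    for original_index in range(num_nodes):
        node_key = layer_name + '_ib-' + str(original_index)
        if node_key in container_nodes:
            conversion[node_key] = kept_nodes
            kept_nodes += 1
    return conversion

# Example: a layer "dense_2" with 3 inbound nodes, of which only
# nodes 0 and 2 belong to the model graph. Node 2 is renumbered to 1;
# node 1 never enters the map, so any later lookup of 'dense_2_ib-1'
# without a default is exactly the kind of place a KeyError can come from.
container_nodes = {'dense_2_ib-0', 'dense_2_ib-2'}
conversion = build_node_conversion_map('dense_2', 3, container_nodes)
assert conversion == {'dense_2_ib-0': 0, 'dense_2_ib-2': 1}
```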
This is because we are using some Keras functionality which allows us to save a model when some metric has fallen/improved. There is a default for this in Keras if your model has a single output, but as yours has 2, you need to set it explicitly before you call the superclass.
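To illustrate the point with a simplified sketch (not the actual deep_qa or Keras code, and the output names below are made up): a "save the model when the monitored metric improves" callback reads one named key out of the per-epoch logs dict. With a single output there is one obvious validation loss to default to, but with two outputs each output contributes its own named loss, so the metric to monitor has to be chosen explicitly:

```python
# Simplified sketch of a "save on improvement" check, in the spirit of
# Keras's ModelCheckpoint. Metric names below are illustrative only.

def should_save(logs, monitor, best_so_far):
    """Return True when the monitored metric improved (lower is better)."""
    if monitor not in logs:
        raise KeyError('Metric %r not found in logs: %r' % (monitor, sorted(logs)))
    return logs[monitor] < best_so_far

# Single-output model: one validation loss, nothing to configure.
single_output_logs = {'loss': 0.9, 'val_loss': 0.8}
assert should_save(single_output_logs, 'val_loss', best_so_far=1.0)

# Two-output model: each output gets its own named loss, so there are
# several candidate keys and the trainer must be told which one to watch.
two_output_logs = {'loss': 1.7,
                   'val_tag_output_loss': 1.0,
                   'val_verb_output_loss': 0.9}
assert should_save(two_output_logs, 'val_tag_output_loss', best_so_far=1.1)
```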
Yes, I did that explicitly and that error is gone. But there is some other problem when accessing the index of verb_array_input. Does this look familiar to you?
tests/models/sequence_tagging/verb_semantics_model_test.py:19:
../../anaconda/lib/python3.5/site-packages/keras/engine/topology.py:2298: KeyError
Captured stdout call
Layer (type)                             Output Shape   Param #   Connected to            Input mask                 Output mask
================================================================================================================================
word_array_input (InputLayer)            (None, 8)      0                                 None                       None
word_embedding (TimeDistributedEmbeddin  (None, 8, 6)   90        word_array_input[0][0]  None                       Tensor("word_embedding/No
dropout_1 (Dropout)                      (None, 8, 6)   0         word_embedding[0][0]    Tensor("word_embedding/No  Tensor("word_embedding/No
bow_encoder_1 (BOWEncoder)               (None, 6)      0         dropout_1[0][0]         Tensor("word_embedding/No  None
dense_2 (Dense)                          (None, 3)      21        bow_encoder_1[0][0]     None                       None
time_distributed_1 (TimeDistributed)     (None, 8, 3)   21        dropout_1[0][0]         Tensor("word_embedding/No  Tensor("word_embedding/No
================================================================================================================================
Total params: 132
On Mon, May 22, 2017 at 1:55 PM, Mark Neumann notifications@github.com wrote: