tensorflow / adanet

Fast and flexible AutoML with learning guarantees.
https://adanet.readthedocs.io
Apache License 2.0

bias not found in checkpoint #118

Closed: augustodelscenario closed this issue 4 years ago

augustodelscenario commented 4 years ago

Hello, I'm trying to build the AdaNet objective example with Keras Dense layers, but I'm running into a problem. Can you please help me with this error?

2019-07-19 07:54:58.178313: W tensorflow/core/framework/op_kernel.cc:1502] OP_REQUIRES failed at save_restore_v2_ops.cc:184 : Not found: Key adanet/iteration_0/subnetwork_t0_2_layer_dnn/dense/bias not found in checkpoint
Traceback (most recent call last):
  File "C:\Python37\lib\site-packages\tensorflow\python\client\session.py", line 1356, in _do_call
    return fn(*args)
  File "C:\Python37\lib\site-packages\tensorflow\python\client\session.py", line 1341, in _run_fn
    options, feed_dict, fetch_list, target_list, run_metadata)
  File "C:\Python37\lib\site-packages\tensorflow\python\client\session.py", line 1429, in _call_tf_sessionrun
    run_metadata)
tensorflow.python.framework.errors_impl.NotFoundError: Key adanet/iteration_0/subnetwork_t0_2_layer_dnn/dense/bias not found in checkpoint
     [[{{node save/RestoreV2}}]]
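
For reference, the checkpoint contents can be listed with something like this (a minimal diagnostic sketch; MODEL_DIR below is a placeholder for the estimator's model_dir):

import tensorflow as tf

# Placeholder path; point this at the estimator's model_dir.
MODEL_DIR = "/tmp/adanet_model"

# Print every (name, shape) pair stored in the latest checkpoint, so the
# missing "dense/bias" key can be compared against what is actually saved.
ckpt = tf.train.latest_checkpoint(MODEL_DIR)
for name, shape in tf.train.list_variables(ckpt):
    print(name, shape)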

Everything works fine when I use this build_subnetwork definition:

    def build_subnetwork(self,
                         features,
                         logits_dimension,
                         training,
                         iteration_step,
                         summary,
                         previous_ensemble=None):
        input_layer = tf.to_float(features[FEATURES_KEY])
        kernel_initializer = tf.glorot_uniform_initializer(seed=self._seed)
        # Stack self._num_layers fully connected ReLU layers, then add a
        # final linear layer that produces the logits.
        last_layer = input_layer
        for _ in range(self._num_layers):
            last_layer = tf.layers.dense(
                last_layer,
                units=self._layer_size,
                activation=tf.nn.relu,
                kernel_initializer=kernel_initializer)
        logits = tf.layers.dense(
            last_layer,
            units=logits_dimension,
            kernel_initializer=kernel_initializer)
        persisted_tensors = {_NUM_LAYERS_KEY: tf.constant(self._num_layers)}
        return adanet.Subnetwork(
            last_layer=last_layer,
            logits=logits,
            complexity=self._measure_complexity(),
            persisted_tensors=persisted_tensors)

But when I switch to Keras layers, I get the error above. The build method then looks like this:

    def build_subnetwork(self,
                         features,
                         logits_dimension,
                         training,
                         iteration_step,
                         summary,
                         previous_ensemble=None):
        input_layer = tf.to_float(features[FEATURES_KEY])
        kernel_initializer = tf.glorot_uniform_initializer(seed=self._seed)
        # Same architecture, but the layers are built with
        # tf.keras.layers.Dense instead of tf.layers.dense.
        last_layer = input_layer
        for _ in range(self._num_layers):
            last_layer = tf.keras.layers.Dense(
                units=64, activation="relu", kernel_initializer=kernel_initializer,
                use_bias=True, bias_initializer=kernel_initializer)(last_layer)
        logits = tf.keras.layers.Dense(
            units=2, activation=None, kernel_initializer=kernel_initializer,
            use_bias=True, bias_initializer=kernel_initializer)(last_layer)
        persisted_tensors = {_NUM_LAYERS_KEY: tf.constant(self._num_layers)}
        return adanet.Subnetwork(
            last_layer=last_layer,
            logits=logits,
            complexity=self._measure_complexity(),
            persisted_tensors=persisted_tensors)
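
For comparison, the variable names the two versions create can be printed with something like this (a rough diagnostic sketch, not part of the original example):

# Diagnostic sketch: print the names of all variables in the current graph,
# e.g. from inside build_subnetwork after the layers above have been created,
# and compare them against the key reported missing in the checkpoint.
for var in tf.global_variables():
    print(var.name)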

What did I do wrong?

cweill commented 4 years ago

Hi @augustodelscenario: this is a known issue, see #100. While we work on resolving it, please keep using tf.layers.
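
For anyone hitting the same error in the meantime, a minimal sketch of that workaround is to express the same two layers with tf.layers.dense instead of tf.keras.layers.Dense, keeping the units, activation, and initializers from the Keras snippet above (the logits width is written here as logits_dimension rather than the hard-coded 2):

        # Workaround sketch: same hidden and logits layers as the Keras
        # version, built with tf.layers.dense until the Keras-layer issue
        # (#100) is resolved. Drops into the body of build_subnetwork in
        # place of the tf.keras.layers.Dense calls.
        last_layer = input_layer
        for _ in range(self._num_layers):
            last_layer = tf.layers.dense(
                last_layer,
                units=64,
                activation=tf.nn.relu,
                kernel_initializer=kernel_initializer,
                use_bias=True,
                bias_initializer=kernel_initializer)
        logits = tf.layers.dense(
            last_layer,
            units=logits_dimension,
            kernel_initializer=kernel_initializer,
            use_bias=True,
            bias_initializer=kernel_initializer)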