tensorflow / adanet

Fast and flexible AutoML with learning guarantees.
https://adanet.readthedocs.io
Apache License 2.0

tf.data to input_fn() #92

Closed tuttlebr closed 2 years ago

tuttlebr commented 5 years ago

Hello,

Can someone please share how to pass a tf.data pipeline to the input_fn of the adanet estimator? I am using customizing_adanet.ipynb as a starting point. The function below accepts file paths as strings and labels as integers. The mapped load_and_preprocess_from_path_label function takes those file-path strings and returns ({"images": image_tensor}, label).

import tensorflow as tf

AUTOTUNE = tf.data.experimental.AUTOTUNE  # TF 1.x location of AUTOTUNE
image_dim = 224  # example size; set to your model's input dimension

def preprocess_image(image):
    image = tf.image.decode_jpeg(image, channels=3)
    image = tf.image.resize_images(image, [image_dim, image_dim])
    image /= 255.0  # normalize to [0, 1] range
    return image

def load_and_preprocess_image(path):
    image = tf.read_file(path)
    return preprocess_image(image)

# The tuples are unpacked into the positional arguments of the mapped function
def load_and_preprocess_from_path_label(path, label):
    return {"images": load_and_preprocess_image(path)}, label

def input_fn(features_path, labels_code, batch_size):

    def _input_fn():
        # convert training data to tensors
        dataset = tf.data.Dataset.from_tensor_slices((features_path, labels_code))
        dataset = dataset.map(load_and_preprocess_from_path_label, num_parallel_calls=AUTOTUNE)

        # Shuffle, repeat, and batch the examples.
        dataset = dataset.shuffle(len(labels_code)).repeat().batch(batch_size)
        iterator = dataset.make_one_shot_iterator()
        features, labels = iterator.get_next()
        return features, labels

    return _input_fn
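The closure pattern above is what tf.estimator (and therefore adanet) expects: you call input_fn once with your data and config, and pass the returned zero-argument callable to train. A minimal pure-Python sketch of that call structure (the tf.data pipeline is stubbed out; file names and labels here are illustrative):

```python
# Sketch of the input_fn closure pattern; the real version builds a
# tf.data.Dataset inside _input_fn and returns (features, labels) tensors.
def input_fn(features_path, labels_code, batch_size):
    def _input_fn():
        # Stub: return one "batch" of raw values instead of tensors.
        return {"images": features_path[:batch_size]}, labels_code[:batch_size]
    return _input_fn

# You pass the *returned callable*, not its result, to the estimator,
# e.g. estimator.train(input_fn=input_fn(paths, labels, 32), ...).
train_fn = input_fn(["a.jpg", "b.jpg", "c.jpg"], [0, 1, 0], batch_size=2)
features, labels = train_fn()  # the Estimator invokes this with no arguments
```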

The error I receive when running the example is: ValueError: logits shape must be [D0, D1, ... DN, logits_dimension], got (?, 10).

LiberiFatali commented 5 years ago

You should check the last layer of your network. For example:

logits = tf.layers.dense(x, units=NUM_CLASSES, activation=None, kernel_initializer=kernel_initializer)

I also had this error when the output of the last layer did not match the number of classes.