tensorflow / models

Models and examples built with TensorFlow

slim API does not support the tensorflow r1.0 API #1135

Closed vaklyuenkov closed 7 years ago

vaklyuenkov commented 7 years ago

After updating to TF r1.0, I get an error when trying to run training in models/slim/ on the "Flowers" dataset:


WARNING:tensorflow:From train_image_classifier.py:474: softmax_cross_entropy (from tensorflow.contrib.losses.python.losses.loss_ops) is deprecated and will be removed after 2016-12-30.
Instructions for updating:
Use tf.losses.softmax_cross_entropy instead.
Traceback (most recent call last):
  File "train_image_classifier.py", line 585, in <module>
    tf.app.run()
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/platform/app.py", line 44, in run
    _sys.exit(main(_sys.argv[:1] + flags_passthrough))
  File "train_image_classifier.py", line 482, in main
    clones = model_deploy.create_clones(deploy_config, clone_fn, [batch_queue])
  File "/media/Disk/CNN/Classification/slim/deployment/model_deploy.py", line 195, in create_clones
    outputs = model_fn(*args, **kwargs)
  File "train_image_classifier.py", line 474, in clone_fn
    label_smoothing=FLAGS.label_smoothing, weight=0.4, scope='aux_loss')
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/util/deprecation.py", line 117, in new_func
    return func(*args, **kwargs)
TypeError: softmax_cross_entropy() got an unexpected keyword argument 'weight'

So, instead of:


      #############################
      # Specify the loss function #
      #############################
      if 'AuxLogits' in end_points:
        slim.losses.softmax_cross_entropy(
            end_points['AuxLogits'], labels,
            label_smoothing=FLAGS.label_smoothing, weight=0.4, scope='aux_loss')
      slim.losses.softmax_cross_entropy(
          logits, labels, label_smoothing=FLAGS.label_smoothing, weight=1.0)
      return end_points

I use:

      #############################
      # Specify the loss function #
      #############################
      if 'AuxLogits' in end_points:
        tf.losses.softmax_cross_entropy(
            end_points['AuxLogits'], labels, weights=0.4,
            label_smoothing=FLAGS.label_smoothing, scope='aux_loss')
      tf.losses.softmax_cross_entropy(
          logits, labels, weights=1.0, label_smoothing=FLAGS.label_smoothing)
      return end_points

The same applies to tf.contrib.slim. It is very important to understand: is this loss function calculation correct?
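As a sanity check on what the op should compute, here is a minimal pure-Python sketch (my own reading of the documented semantics, not the TensorFlow implementation) of softmax cross-entropy with label smoothing and a loss weight, for a single example:

```python
import math

def softmax_cross_entropy(onehot_labels, logits, weights=1.0, label_smoothing=0.0):
    """Sketch of weighted softmax cross-entropy for one example (flat lists)."""
    n = len(onehot_labels)
    if label_smoothing > 0:
        # Smooth targets: y = y * (1 - eps) + eps / num_classes
        onehot_labels = [y * (1.0 - label_smoothing) + label_smoothing / n
                         for y in onehot_labels]
    # Numerically stable log-partition for log-softmax.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    # Cross-entropy: -sum_i y_i * log softmax(x)_i, scaled by the loss weight.
    return weights * sum(-y * (x - log_z) for y, x in zip(onehot_labels, logits))

loss = softmax_cross_entropy([1.0, 0.0, 0.0], [2.0, 1.0, 0.1])
```

If the r1.0 op follows these semantics, `weights` simply rescales the loss (e.g. 0.4 for the auxiliary head) and `label_smoothing` spreads a little probability mass over the other classes.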

Thank you!

vaklyuenkov commented 7 years ago

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/losses/python/losses/loss_ops.py

@deprecated("2016-12-30", "Use tf.losses.softmax_cross_entropy instead. Note that the order of the logits and labels arguments has been changed.")
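That deprecation note is the key point: besides renaming `weight` to `weights`, it says the logits/labels positional order changed. If so, positional calls ported from the old API would silently pass logits where one-hot labels are expected. A minimal sketch with stand-in signatures (not the real TensorFlow ops) showing why keyword arguments sidestep the pitfall:

```python
# Stand-in signatures (hypothetical, not the real TensorFlow functions)
# illustrating the argument-order change the deprecation message warns about.
def contrib_softmax_cross_entropy(logits, onehot_labels, weight=1.0,
                                  label_smoothing=0.0, scope=None):
    # Deprecated tf.contrib.losses order: logits first.
    return {'logits': logits, 'labels': onehot_labels, 'weight': weight}

def tf_losses_softmax_cross_entropy(onehot_labels, logits, weights=1.0,
                                    label_smoothing=0.0, scope=None):
    # New tf.losses order: one-hot labels first, and `weight` became `weights`.
    return {'logits': logits, 'labels': onehot_labels, 'weight': weights}

# Keyword arguments keep the mapping explicit regardless of position:
old = contrib_softmax_cross_entropy(logits='aux_logits', onehot_labels='y',
                                    weight=0.4)
new = tf_losses_softmax_cross_entropy(logits='aux_logits', onehot_labels='y',
                                      weights=0.4)
```

Calling the new op with `onehot_labels=` and `logits=` spelled out would make the port robust to the swap.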