Hey @remic33, the tf.contrib module is deprecated in TF 2.0: https://www.tensorflow.org/beta/guide/upgrade
I know tf.contrib is deprecated. I tried to find whether that particular function lives somewhere in the new TensorFlow, because most of contrib has been moved to core, but I can't find tf.contrib.training now.
@remic33, a quick and dirty solution is to comment out the hparam_pb2 references in hparam.py (one assertion, ops.register_proto_function, to_proto, and the import) and try it that way. It seems to be working for me.
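For anyone unsure what that edit looks like, here is a rough sketch. The exact line contents vary between TF versions and are written from memory of the contrib-era hparam.py, so treat this as a guide, not a patch:

```python
# hparam.py -- comment out everything that touches hparam_pb2:

# 1. the import near the top of the file
# from tensorflow.contrib.training.python.training import hparam_pb2

# 2. the assertion/type check against hparam_pb2.HParamDef inside HParams
# assert isinstance(hparam_def, hparam_pb2.HParamDef)

# 3. the to_proto serialization method on HParams
# def to_proto(self, export_scope=None):
#     ...

# 4. the proto registration at the bottom of the module
# ops.register_proto_function(
#     'hparams',
#     proto_type=hparam_pb2.HParamDef,
#     to_proto=HParams.to_proto,
#     from_proto=HParams.from_proto)
```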
Also, by the way, if you are open to exploring a third-party package, I published this Python package for managing HParams: https://github.com/PetrochukM/HParams
Apologies, but this issue tracker is for TF Community governance. I encourage you to address questions to discuss@tensorflow.org or StackOverflow.
Does anyone have a solution for this? Is there an equivalent of tf.contrib.training.HParams in TensorFlow 2.0 or not?
Looks like there is a fork in tensor2tensor: https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/utils/hparam.py
But alternatively you might want to check out: https://github.com/keras-team/keras-tuner
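If you go the tensor2tensor route, usage should look essentially like the old contrib class. A minimal sketch, assuming the fork preserves the contrib API (names checked against the file linked above, but not against every release):

```python
from tensor2tensor.utils.hparam import HParams

# Construct with keyword arguments, exactly like tf.contrib.training.HParams.
hparams = HParams(learning_rate=0.1, batch_size=32)
hparams.add_hparam('num_layers', 4)   # add a parameter after construction
print(hparams.learning_rate)          # attribute-style access
print(hparams.to_json())              # serialization helper carried over from contrib
```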
So I worked on this a bit. For what I needed there was a workaround: since the code I was looking at wasn't making much use of HParams, I could replace it in an ad-hoc way.
However, fixing this error just led me to a new TF 1.4 --> 2.0 incompatibility.
I would advise anyone encountering the same kind of error not to try to fix it. TF 1.4 --> 2.0 breaks a lot of things. For my specific need, I think it will be easiest to start a brand-new project using TF 2.0, do what I need to do in terms of optimizing my model, and then manually write that into the TF 1.4 code as a model.
There is a tf_upgrade_v2 script on the TensorFlow website (https://www.tensorflow.org/guide/upgrade), but I cannot vouch for how likely it is to actually work or how much time it will take you.
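For reference, the script is invoked from the command line; the flags below are the ones documented in the linked guide:

```
# Upgrade a single file
tf_upgrade_v2 --infile my_model.py --outfile my_model_v2.py

# Or convert a whole project tree, with a report of what was changed
tf_upgrade_v2 --intree my_project/ --outtree my_project_v2/ --reportfile report.txt
```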
Looks like they were deprecated but still used here: https://github.com/tensorflow/tensorboard/blob/master/tensorboard/plugins/hparams/api.py
> @remic33, a quick and dirty solution is to comment out the hparam_pb2 references in hparam.py (one assertion, ops.register_proto_function, to_proto, and the import) and try it that way. It seems to be working for me.

Worked for me.
I saw a post favouring keras-tuner over hparams because hparams was removed from contrib. I don't think they realised it moved to the TensorBoard plugins, as stated above. From my limited reading, it looks like using TensorBoard's hparams might allow more useful analysis within TensorBoard. Is there any advantage to using keras-tuner vs hparams?
> a quick and dirty solution is to comment out the hparam_pb2 references in hparam.py […]

Worked for me.
Could you please explain this in a bit more detail? I have been fighting this for hours now. Where can I find hparam.py? I only have hparam.json in my models. Thank you.
A simple, Pythonic solution:

```python
from collections import namedtuple

def get_hparams(**kwargs):
    # Build a namedtuple whose fields are the keyword names, giving
    # attribute-style access (hparams.param1) like the old HParams object.
    return namedtuple('GenericDict', kwargs.keys())(**kwargs)

hparams = get_hparams(param1='P1', param2=p2, ...)  # p2, ... stand for your own values
```
In TF 2.0 there is a new API, tensorboard.plugins.hparams.api, that includes a class HParam.
Usage of the new API is described in this guide: Hyperparameter Tuning with the HParams Dashboard.
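A minimal sketch following that guide (the hyperparameter names and train_model are made up for illustration):

```python
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# Declare the hyperparameters and their domains
HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
HP_DROPOUT = hp.HParam('dropout', hp.RealInterval(0.1, 0.5))

# Register them (plus the metrics to track) with the dashboard
with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
    hp.hparams_config(
        hparams=[HP_NUM_UNITS, HP_DROPOUT],
        metrics=[hp.Metric('accuracy', display_name='Accuracy')],
    )

def run(run_dir, hparams):
    """Log one trial: the hparam values used plus the resulting metric."""
    with tf.summary.create_file_writer(run_dir).as_default():
        hp.hparams(hparams)              # record {HParam: value} for this trial
        accuracy = train_model(hparams)  # train_model is your own training code
        tf.summary.scalar('accuracy', accuracy, step=1)
```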
> `def get_hparams(**kwargs): return namedtuple('GenericDict', kwargs.keys())(**kwargs)`

This worked. You also need `from collections import namedtuple`.
I am using the AutoAugment util from the TensorFlow Object Detection API for data augmentation. That util uses tf.contrib.training.HParams, and I want to use it in TF 2.0 RC. I can't find where that function is hidden now; any help?
Try this approach:

1st: Download TensorFlow Addons from https://github.com/tensorflow/addons, then move the tensorflow_addons folder into ..\tensorflow\Lib\site-packages, where all the packages live.

2nd: Apply the change shown in the linked commit (the original comment illustrated this with a before/after screenshot: replace the red-highlighted lines with the green ones).

3rd: (screenshot of the result; the image did not carry over)

This worked for me. Source: https://github.com/hrsma2i/kaggle-imaterialist2020-model/pull/12/commits/735b8fa61d9da66e8a91e87f6e1276c815045e76
> A simple, Pythonic solution:
>
> `def get_hparams(**kwargs): return namedtuple('GenericDict', kwargs.keys())(**kwargs)`
> `hparams = get_hparams(param1='P1', param2=p2, ...)`

It also worked for me. You also need `from collections import namedtuple`. Thanks a lot!