lighttransport / VisemeNet-infer

CPU inference version of VisemeNet-tensorflow
MIT License

Batch_normalization layer from model.py when running create_graphdef #2

Closed anasvaf closed 4 years ago

anasvaf commented 4 years ago

Hello, I am using TensorFlow 2.0 and I have an issue with the tf.contrib batch_norm layer. I have tried to replace it with tf.keras.layers.BatchNormalization, since it takes similar parameters (e.g., is_training --> trainable, center, etc.), and to change the parameter scope='net1_land_bn' to name='net1_land_bn'. However, when I write l2 = tf.keras.layers.BatchNormalization(l1, center=True, scale=True, trainable=phase, name='net1_land_bn'), it complains that the first argument to the BatchNorm layer should be an axis, i.e., an integer.
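The error arises because in Keras the constructor takes only configuration (its first positional argument is axis); the input tensor is passed when the layer instance is called. A minimal sketch of the intended replacement, reusing the names l1 and phase from the snippet above (the shapes here are illustrative, not from model.py):

```python
import numpy as np
import tensorflow as tf

# Hypothetical input tensor standing in for l1; phase=False means inference.
l1 = tf.constant(np.random.rand(4, 8).astype("float32"))
phase = False

# Construct the layer with configuration only...
bn = tf.keras.layers.BatchNormalization(
    center=True, scale=True, trainable=phase, name="net1_land_bn")

# ...then call it on the input. v1's is_training maps to the call-time
# `training` flag, while `trainable` controls whether variables update.
l2 = bn(l1, training=phase)
```

Note the split: trainable is a constructor argument, but the v1 is_training behavior (use batch statistics vs. moving averages) is the training argument at call time.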

Did you face any similar issue? Thank you in advance! Tasos

syoyo commented 4 years ago

We haven't tested creating .pb with TF v2.

I think you can first create the model using v1, then read it in v2_use_frozen.py https://github.com/lighttransport/VisemeNet-infer/blob/master/v2_use_frozen.py to run the inference in TF v2.
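As a rough sketch of that workflow: a v1-style frozen GraphDef can be wrapped into a callable in TF v2 via tf.compat.v1.wrap_function. The helper below follows the standard TF migration pattern; the tensor names and the tiny demo graph are illustrative, not taken from v2_use_frozen.py:

```python
import tensorflow as tf

def wrap_frozen_graph(graph_def, inputs, outputs):
    """Wrap a v1 GraphDef so it can be called eagerly in TF v2."""
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")
    wrapped = tf.compat.v1.wrap_function(_imports_graph_def, [])
    # Prune the wrapped graph down to a function from `inputs` to `outputs`.
    return wrapped.prune(
        tf.nest.map_structure(wrapped.graph.as_graph_element, inputs),
        tf.nest.map_structure(wrapped.graph.as_graph_element, outputs))

# Build a trivial stand-in graph (in practice this would be the frozen .pb
# loaded from disk with tf.compat.v1.GraphDef.FromString).
g = tf.Graph()
with g.as_default():
    x = tf.compat.v1.placeholder(tf.float32, [None, 2], name="x")
    tf.multiply(x, 2.0, name="y")

frozen_fn = wrap_frozen_graph(g.as_graph_def(), inputs="x:0", outputs="y:0")
out = frozen_fn(tf.constant([[1.0, 3.0]]))
```

The pruned function runs the frozen graph eagerly, so the rest of the inference script can stay in plain v2 style.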

FYI, net1 is not used in the inference stage (it outputs facial landmark points).

anasvaf commented 4 years ago

Thank you very much for your prompt response! I really appreciate it! I managed to solve the issue.