achbogga opened this issue 6 years ago
@achbogga What do you mean by removing the batch normalization? Did you use the UFF parser or TensorFlow 1.7's new API? I want to use TensorRT to speed up my model too; please give me some advice.
Hello @jerryhouuu, I used TensorFlow 1.4 and TensorRT 4.0.0.3. Some TensorFlow operations, such as Merge and Switch, which are used in the slim batch normalization module, are not supported by the NVIDIA parsers yet. So I was suggesting removing them from the graph and retraining the model. Hope it helps.
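For reference, a quick way to check whether a frozen graph still contains those parser-breaking ops is to scan its nodes. This is a minimal sketch assuming TensorFlow 1.x and a hypothetical frozen-graph path `facenet_frozen.pb`:

```python
# Minimal sketch: scan a frozen TensorFlow graph for the Switch/Merge ops
# that the NVIDIA parsers reject. Path and op set are assumptions.
import tensorflow as tf

UNSUPPORTED = {"Switch", "Merge"}

graph_def = tf.GraphDef()
with tf.gfile.GFile("facenet_frozen.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

offending = [n.name for n in graph_def.node if n.op in UNSUPPORTED]
print("Nodes using unsupported ops:", offending)
```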
Thanks
@achbogga I am not sure whether you mean setting `normalizer_fn=None` in inception_resnet_v1.py and then retraining a new model. Thanks for your help.

```python
with slim.arg_scope([slim.conv2d, slim.fully_connected],
                    weights_initializer=slim.initializers.xavier_initializer(),
                    weights_regularizer=slim.l2_regularizer(weight_decay),
                    normalizer_fn=None,  # disable slim batch norm
                    normalizer_params=batch_norm_params):  # ignored once normalizer_fn is None
    return inception_resnet_v1(images, is_training=phase_train,
                               dropout_keep_prob=keep_probability,
                               bottleneck_layer_size=bottleneck_layer_size,
                               reuse=reuse)
```
Yep. Please let me know if you have some luck with it. Thanks, Achyut.
@achbogga Hello, could you give me some advice about how you modified the batch normalization so that the model could be retrained normally? Thanks for your help.
If you still need TensorRT to speed up inference, try the modification from https://github.com/JerryJiaGit/facenet_trt. It works on my GV100 with TensorFlow 1.12 and TensorRT 4, loads a SavedModel, and gives a 30% speedup with FP16, with only a few lines added to facenet.py. FYI.
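For anyone curious what those few lines look like: TensorFlow 1.12 ships the TF-TRT integration as `tf.contrib.tensorrt`. The sketch below is my own minimal version, not the exact code from that repo; the frozen-graph path, batch size, and workspace size are assumptions, and `embeddings` is facenet's usual output node name:

```python
# Minimal TF-TRT sketch (TensorFlow 1.12 built with TensorRT): convert a
# frozen facenet graph into a TensorRT-optimized graph with FP16 precision.
import tensorflow as tf
import tensorflow.contrib.tensorrt as trt  # TF-TRT contrib module in TF 1.x

graph_def = tf.GraphDef()
with tf.gfile.GFile("facenet_frozen.pb", "rb") as f:  # assumed path
    graph_def.ParseFromString(f.read())

trt_graph = trt.create_inference_graph(
    input_graph_def=graph_def,
    outputs=["embeddings"],            # facenet's embedding output node
    max_batch_size=1,                  # assumption; tune for your workload
    max_workspace_size_bytes=1 << 30,  # 1 GB TensorRT workspace
    precision_mode="FP16")             # the FP16 mode mentioned above

# The returned GraphDef can then be imported and run as usual:
with tf.Graph().as_default():
    tf.import_graph_def(trt_graph, name="")
```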
Hello @davidsandberg ,
I have successfully converted the entire facenet graph into a TensorRT engine after removing just the batch normalization. However, I don't have access to the data to retrain the network. I would highly appreciate it (I can buy you a cool gift) if you could retrain the model without batch normalization and provide us with the trained model file.
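For reference, a UFF-based conversion with the legacy TensorRT 4 Python API looks roughly like the sketch below; the input placeholder name, input shape, and file names are assumptions and will need to match whatever your frozen graph actually uses:

```python
# Sketch: frozen TensorFlow graph -> UFF -> TensorRT engine, using the
# legacy TensorRT 4 Python API. Node names and shapes are placeholders.
import uff
import tensorrt as trt
from tensorrt.parsers import uffparser

# Serialize the (batch-norm-free) frozen graph to UFF.
uff_model = uff.from_tensorflow_frozen_model("facenet_no_bn_frozen.pb",
                                             ["embeddings"])

# Describe the network I/O for the parser (CHW input order).
parser = uffparser.create_uff_parser()
parser.register_input("input", (3, 160, 160), 0)  # assumed placeholder name/shape
parser.register_output("embeddings")

# Build the engine: logger, model, parser, max batch size, workspace bytes.
logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)
engine = trt.utils.uff_to_trt_engine(logger, uff_model, parser, 1, 1 << 30)
```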
Thanks a million in advance. Hoping to get a response, Achyut Boggaram