barantunc closed this issue 5 years ago
@barantunc, to freeze the model and do inference, just run test_lanenet.py. For real-time performance, use a light-weight backbone such as ResNet-18. I will release a light-weight model that achieves real-time performance soon.
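For readers who still need the frozen `.pb` itself, a minimal sketch of the usual TF1 freezing flow is below. It is written against `tf.compat.v1` so it also runs on TF2, and it uses a tiny stand-in graph; the node names `input_tensor` / `lanenet_output` are assumptions, not the repo's actual names, so check your own graph before freezing.

```python
# Minimal sketch: freeze a TF1-style graph into a .pb file.
# The graph and node names here are hypothetical stand-ins; in practice you
# would rebuild the LaneNet graph and saver.restore(sess, checkpoint) first.
import tensorflow as tf

tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

x = tf.compat.v1.placeholder(tf.float32, [None, 4], name='input_tensor')
w = tf.compat.v1.get_variable('w', initializer=tf.zeros([4, 2]))
tf.identity(tf.matmul(x, w), name='lanenet_output')  # named output node

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    # Bake the variables into constants, keeping only what feeds the output.
    frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ['lanenet_output'])

# Serialize the frozen GraphDef to disk.
with tf.io.gfile.GFile('frozen_lanenet.pb', 'wb') as f:
    f.write(frozen.SerializeToString())
```

The only part you must get right for a real model is the output node list passed to `convert_variables_to_constants`.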
Thanks for your brilliant work. I have the same question: I want to use the pre-trained model in a VS project, so I need a .pb model. Which output nodes should I use for the frozen .pb? Thanks for your reply.
@cardwing thanks for your reply
Thanks, I got the .pb by adding nodes.
May I ask which nodes exactly? Thanks a lot.
You need to add a new output node at the end.
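A sketch of what "adding a new output node" typically means: wrap the network's final tensor in `tf.identity` so it has a stable, known name, or print the graph's node names to find the existing last nodes. The names below (`input_tensor`, `final_output`) are hypothetical, not taken from this repo.

```python
# Two common ways to pin down the output node name before freezing:
# (1) wrap the last tensor in tf.identity to give it a known name;
# (2) print the graph's node names and inspect the last ones.
import tensorflow as tf

tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

x = tf.compat.v1.placeholder(tf.float32, [None, 3], name='input_tensor')
logits = x * 2.0                                  # stand-in for the last layer
out = tf.identity(logits, name='final_output')    # (1) the name to freeze on

# (2) on the real model, run this after building the full graph
for node in tf.compat.v1.get_default_graph().as_graph_def().node:
    print(node.name, node.op)
```

Whatever name you give the identity op is what you pass as the output node to the freezing tool.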
@barantunc @cardwing @SuperPengXue Hello, I am trying to get a .pb file from my checkpoint so I can use it on my Android smartphone. I found that it is possible to generate one with export_inference_graph.py from the TensorFlow models repo, but I trained with VGG16 and there is no config file for it.
https://github.com/tensorflow/models/issues/2092
https://github.com/tensorflow/models/tree/master/research/object_detection/samples/configs
Command: python3 export_inference_graph.py --input_type --pipeline_config_path --trained_checkpoint_prefix --output_directory
I don't know what argument I should use for --pipeline_config_path. Is there a better way to generate the .pb file, and is a .pbtxt file required for lane detection?
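One note that may help here: `export_inference_graph.py` and its pipeline config belong to the Object Detection API, not to this repo, and a frozen `.pb` is self-contained, so no `.pbtxt` is needed to run it. A minimal round-trip sketch (freeze in memory, reload, run) is below; the graph and the names `input_tensor` / `output_tensor` are toy placeholders, not this repo's real nodes.

```python
# Sketch: a frozen GraphDef is self-contained -- it can be re-imported and
# run with no config or .pbtxt. Toy graph and names are hypothetical.
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# --- build and freeze a tiny stand-in graph (replace with your model) ---
g = tf.Graph()
with g.as_default():
    x = tf.compat.v1.placeholder(tf.float32, [None, 4], name='input_tensor')
    w = tf.compat.v1.get_variable('w', initializer=tf.ones([4, 2]))
    tf.identity(tf.matmul(x, w), name='output_tensor')
    with tf.compat.v1.Session() as sess:
        sess.run(tf.compat.v1.global_variables_initializer())
        frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
            sess, g.as_graph_def(), ['output_tensor'])

# --- import the frozen GraphDef into a fresh graph and run inference ---
run_graph = tf.Graph()
with run_graph.as_default():
    tf.import_graph_def(frozen, name='')
with tf.compat.v1.Session(graph=run_graph) as sess:
    result = sess.run('output_tensor:0',
                      feed_dict={'input_tensor:0': np.ones((1, 4), np.float32)})
print(result)  # each element is 4.0 for this toy all-ones graph
```

For Android, the same frozen `.pb` (or a TFLite conversion of it) is what the mobile runtime consumes.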
> Thanks, I got the .pb by adding nodes.

How do you get the last output node?
Hello, thanks for your brilliant work.
I am relatively new to this, and I am trying to test the CULane pretrained model in real time over a webcam. First, however, I need to freeze the model. Which output nodes should I keep for the frozen model to work?
Optional: any suggestions for real-time inference?
Thanks for your time.