AlexandraZajdel opened this issue 4 years ago
@AleksandraPestka I'm also facing the same issue. Were you able to resolve this?
@ankitksharma No, unfortunately
@AleksandraPestka I wrote an export script which writes a `frozen_inference_graph.pb` that you can then use in the demo script. I hope it is of help to you:
```shell
python "${WORK_DIR}"/export_model.py \
  --checkpoint_path="${TRAIN_LOGDIR}/model.ckpt-500000" \
  --export_path="${TRAIN_LOGDIR}/frozen_inference_graph.pb" \
  --model_variant="mobilenet_v2" \
  --num_classes=2 \
  --atrous_rates=6 \
  --atrous_rates=12 \
  --atrous_rates=18 \
  --output_stride=16 \
  --decoder_output_stride=4 \
  --crop_size=256 \
  --crop_size=256 \
  --inference_scales=1.0 \
  --save_inference_graph=true
```
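Once the export finishes, the frozen graph can be loaded for inference. The sketch below is a minimal, hedged example of loading a frozen `GraphDef` with the TF1 compatibility API; the tensor names `ImageTensor:0` and `SemanticPredictions:0` are the ones the DeepLab demo uses, but verify them against your own export:

```python
import numpy as np
import tensorflow as tf

# Tensor names used by the DeepLab demo notebook; adjust if your export differs.
INPUT_TENSOR = "ImageTensor:0"
OUTPUT_TENSOR = "SemanticPredictions:0"

def load_frozen_graph(pb_path):
    """Read a serialized frozen GraphDef from disk and import it into a new Graph."""
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())
    graph = tf.Graph()
    with graph.as_default():
        tf.compat.v1.import_graph_def(graph_def, name="")
    return graph

def predict(graph, image):
    """Run a single uint8 HxWx3 image through the frozen graph.

    Returns the per-pixel class map produced by the output tensor.
    """
    with tf.compat.v1.Session(graph=graph) as sess:
        return sess.run(
            OUTPUT_TENSOR,
            feed_dict={INPUT_TENSOR: np.expand_dims(image, 0)},
        )

if __name__ == "__main__":
    graph = load_frozen_graph("frozen_inference_graph.pb")
    seg_map = predict(graph, np.zeros((256, 256, 3), dtype=np.uint8))
    print(seg_map.shape)
```

This mirrors what the demo script does internally: parse the `.pb`, import it into a fresh graph, and feed a batched image through a session.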
Running train-pqr.sh completes without errors and the model is saved, but running eval-pqr.sh gives the error below.
Before you ask: I use the same train_crop_size and eval_crop_size in the mentioned scripts.
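For reference, here is roughly how the matching flags would look in the two scripts. This is a sketch assuming the standard deeplab `train.py`/`eval.py` flag names; note that in deeplab, `eval_crop_size` must be at least as large as the full evaluation image, which can differ from the training crop:

```shell
# In train-pqr.sh: crop size is given as height then width.
python "${WORK_DIR}"/train.py \
  --train_crop_size=256 \
  --train_crop_size=256 \
  ...

# In eval-pqr.sh: must cover the full evaluation image dimensions.
python "${WORK_DIR}"/eval.py \
  --eval_crop_size=256 \
  --eval_crop_size=256 \
  ...
```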