Closed — zwy1996 closed this issue 3 years ago
Hi @zwy1996,
Thanks for your interest in our work!
Please find the script for Cityscapes below. I believe this is the set of hyper-parameters we experimented with. I haven't tested this script against this codebase, so it may not be fully compatible. (I currently don't have an environment set up to test this GitHub codebase...) It should be fine to simply take the hyper-parameters from here.
Best regards, Jyh-Jing
# Set up parameters for training.
BATCH_SIZE=8
TRAIN_INPUT_SIZE=720,720
WEIGHT_DECAY=5e-4
ITER_SIZE=1
NUM_STEPS=150000
NUM_CLASSES=19
LEARNING_RATE=3e-3
NUM_GPU=4
# Set up parameters for inference.
INFERENCE_INPUT_SIZE=720,720
INFERENCE_STRIDES=480,480
INFERENCE_SPLIT=val
# Set up parameters for vmf.
NUM_CLUSTERS=8
PROTOTYPE_NUM_CLUSTERS=16
KMEANS_ITERATIONS=15
# Set up path for saving models.
SNAPSHOT_DIR=snapshots/cityscapes/segsort/segsort_mgpu_lr3e-3_it150k
# Set up the procedure pipeline.
IS_TRAIN=1
IS_PROTOTYPE=1
IS_INFERENCE=1
IS_BENCHMARK=1
# Update PYTHONPATH.
export PYTHONPATH=`pwd`:$PYTHONPATH
# Set up the data directory.
DATAROOT=/ssd/jyh/datasets
# Train.
if [ ${IS_TRAIN} -eq 1 ]; then
  python3 pyscripts/train/train_vmf_mgpu.py \
    --snapshot-dir ${SNAPSHOT_DIR} \
    --restore-from snapshots/imagenet/trained/resnet_v1_101.ckpt \
    --data-list dataset/cityscapes/train.txt \
    --data-dir ${DATAROOT}/Cityscapes/ \
    --batch-size ${BATCH_SIZE} \
    --save-pred-every 10000 \
    --update-tb-every 500 \
    --input-size ${TRAIN_INPUT_SIZE} \
    --learning-rate ${LEARNING_RATE} \
    --weight-decay ${WEIGHT_DECAY} \
    --iter-size ${ITER_SIZE} \
    --num-classes ${NUM_CLASSES} \
    --num-steps $(($NUM_STEPS+1)) \
    --num-gpu ${NUM_GPU} \
    --random-mirror \
    --random-scale \
    --random-crop \
    --not-restore-classifier \
    --is-training \
    --num_clusters ${NUM_CLUSTERS} \
    --kmeans_iterations ${KMEANS_ITERATIONS}
fi
# Prototype.
if [ ${IS_PROTOTYPE} -eq 1 ]; then
  python3 pyscripts/inference/prototype_embedding.py \
    --data-dir ${DATAROOT}/Cityscapes/ \
    --data-list dataset/cityscapes/train.txt \
    --input-size ${INFERENCE_INPUT_SIZE} \
    --strides ${INFERENCE_STRIDES} \
    --restore-from ${SNAPSHOT_DIR}/model.ckpt-${NUM_STEPS} \
    --num-classes ${NUM_CLASSES} \
    --ignore-label 255 \
    --num_clusters ${PROTOTYPE_NUM_CLUSTERS} \
    --kmeans_iterations ${KMEANS_ITERATIONS} \
    --save-dir ${SNAPSHOT_DIR}/results/train
fi
# Inference.
if [ ${IS_INFERENCE} -eq 1 ]; then
  python3 pyscripts/inference/inference_vmf_msc.py \
    --data-dir ${DATAROOT}/Cityscapes/ \
    --data-list dataset/cityscapes/${INFERENCE_SPLIT}.txt \
    --input-size ${INFERENCE_INPUT_SIZE} \
    --strides ${INFERENCE_STRIDES} \
    --restore-from ${SNAPSHOT_DIR}/model.ckpt-${NUM_STEPS} \
    --colormap misc/colormapcs.mat \
    --num-classes ${NUM_CLASSES} \
    --ignore-label 255 \
    --num_clusters ${NUM_CLUSTERS} \
    --kmeans_iterations ${KMEANS_ITERATIONS} \
    --save-dir ${SNAPSHOT_DIR}/results/${INFERENCE_SPLIT} \
    --prototype_dir ${SNAPSHOT_DIR}/results/train/prototypes
fi
# Benchmark.
if [ ${IS_BENCHMARK} -eq 1 ]; then
  python3 pyscripts/benchmark/benchmark_by_mIoU.py \
    --pred-dir ${SNAPSHOT_DIR}/results/${INFERENCE_SPLIT}/gray/ \
    --gt-dir ${DATAROOT}/Cityscapes/gtFineId/${INFERENCE_SPLIT}/all/ \
    --num-classes ${NUM_CLASSES} \
    --string-replace leftImg8bit,gtFineId_labelIds
fi
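Note that the four IS_* variables act as simple stage toggles: each block runs only when its flag is set to 1, so you can skip stages (e.g. re-run only inference and benchmarking against an existing checkpoint) by flipping a flag to 0. A minimal standalone sketch of that pattern (the echo messages are illustrative only):

```shell
#!/usr/bin/env bash
# Stage toggles, as in the Cityscapes script above.
# Setting IS_TRAIN=0 skips the training block entirely.
IS_TRAIN=0
IS_INFERENCE=1

if [ ${IS_TRAIN} -eq 1 ]; then
  echo "running training stage"
fi
if [ ${IS_INFERENCE} -eq 1 ]; then
  echo "running inference stage"
fi
```

With IS_TRAIN=0, only the inference block executes.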
I will try this. Thank you very much!
Hello @jyhjinghwang, thank you very much for your code.
I would like to ask where I can find the pyscripts/inference/inference_vmf_msc.py script. In pyscripts/inference, I see only inference_vmf.py, inference_msc.py, inference_vmf_embedding.py, and inference_segsort_msc.py (these seem to be related by name).
Thank you very much in advance for your reply.
Hi @vobecant, can you elaborate on why you need inference_vmf_msc.py? I think we mainly use inference_segsort / inference_segsort_msc in this GitHub repo. Also, inference_vmf_embedding.py actually performs multi-scale inference, which might serve your purpose. Thanks.
I need it to run the script that you posted in your comment above (from 30th Jan, 2021).
Oh I see! I think you can safely change it to inference_segsort_msc.py.
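With that substitution, the inference block of the script above would look like the sketch below: only the script name changes, under the assumption that inference_segsort_msc.py accepts the same flags.

```shell
# Inference, with inference_segsort_msc.py substituted for the
# missing inference_vmf_msc.py (all flags kept as in the original script).
if [ ${IS_INFERENCE} -eq 1 ]; then
  python3 pyscripts/inference/inference_segsort_msc.py \
    --data-dir ${DATAROOT}/Cityscapes/ \
    --data-list dataset/cityscapes/${INFERENCE_SPLIT}.txt \
    --input-size ${INFERENCE_INPUT_SIZE} \
    --strides ${INFERENCE_STRIDES} \
    --restore-from ${SNAPSHOT_DIR}/model.ckpt-${NUM_STEPS} \
    --colormap misc/colormapcs.mat \
    --num-classes ${NUM_CLASSES} \
    --ignore-label 255 \
    --num_clusters ${NUM_CLUSTERS} \
    --kmeans_iterations ${KMEANS_ITERATIONS} \
    --save-dir ${SNAPSHOT_DIR}/results/${INFERENCE_SPLIT} \
    --prototype_dir ${SNAPSHOT_DIR}/results/train/prototypes
fi
```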
Thank you!
Hello, nice work!
Sorry to bother you about the Cityscapes files. I can run the code easily, but I found I cannot run the experiments on the Cityscapes dataset. Could you share the training scripts, parameters, and so on for Cityscapes? Or should I use other code?
Thank you very much!