Please try the following script, where I add --share_up_sampler to the command in multiple_adapters.sh. Thanks.
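# Same as multiple_adapters.sh, with --share_up_sampler added to the training
# command below (the flag suggested above for the half-shared adapters).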
task=multitask
model="bart"  # or "t5"
echo $model
if [ $model == "t5" ]
then
folder_prefix="VLT5"
backbone="t5-base"
batch_size=300
elif [ $model == "bart" ]
then
folder_prefix="VLBart"
backbone="facebook/bart-base"
batch_size=500
fi
echo $folder_prefix
echo $backbone
feature=RN101
lr=3e-4
name=4tasks_hard_${feature}_LMUpSharedAdapter+r8+ln_bs${batch_size}_image224_lr${lr}
output=snap/${folder_prefix}_${task}/$name
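# $1 is the number of GPUs handed to torch.distributed.launch; any further
# arguments passed to this script (${@:2}) are forwarded to the training command.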
TOKENIZERS_PARALLELISM=True PYTHONPATH=$PYTHONPATH:./src \
python -m torch.distributed.launch \
--nproc_per_node=$1 \
--master_port=26757 \
src/${task}.py \
--distributed --multiGPU \
--optim adamw \
--warmup_ratio 0.1 \
--clip_grad_norm 5 \
--lr ${lr} \
--epochs 20 \
--num_workers 4 \
--backbone ${backbone} \
--output $output ${@:2} \
--num_beams 5 \
--batch_size ${batch_size} \
--valid_batch_size ${batch_size} \
--use_adapter \
--share_up_sampler \
--unfreeze_layer_norms \
--reduction_factor 8 \
--tasks "vqa,gqa,nlvr,caption" \
--feature ${feature} --n_boxes 36 --downsample \
--image_size "(224,224)" \
--run_name $name
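In case it helps: since the first positional argument ($1) sets --nproc_per_node and everything from the second argument onward (${@:2}) is appended to the python command, a run would look something like the line below, assuming the script is saved as multiple_adapters.sh and you launch on 2 GPUs (the GPU count is just an illustration):

bash multiple_adapters.sh 2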
Thanks again and let me try this script~ 🔥
Hi again, I would like to ask where the entry point for the half-shared adapters is.
In the README, there are only single and multiple adapters.
Thanks in advance!