Alrope123 / rethinking-demonstrations


Out-of-memory #10

Open myhakureimu opened 11 months ago

myhakureimu commented 11 months ago

I tested the following command on Google Colab with an A100 (40GB) but ran out of memory. May I ask how much memory this command needs?

!python test.py --gpt2 channel-metaicl --method direct --out_dir out/gpt2-large --do_zeroshot --use_demonstrations --k 16 --seed 100,13,21,42,87 --dataset glue-wnli_random

Alrope123 commented 3 months ago

Hi there. MetaICL models are 774M-parameter models, so an A100 (40GB) should suffice. Try decreasing the default batch size (64) using the flag --test_batch_size (link to the code).
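
As a sketch, the original command could be rerun with a smaller batch size via the `--test_batch_size` flag mentioned above; the value 16 here is an illustrative choice, not one confirmed by the thread:

```shell
# Same invocation as before, with the batch size lowered from the
# default of 64 to reduce peak GPU memory usage (16 is a guess; try
# smaller values if OOM persists).
python test.py --gpt2 channel-metaicl --method direct \
    --out_dir out/gpt2-large --do_zeroshot --use_demonstrations \
    --k 16 --seed 100,13,21,42,87 --dataset glue-wnli_random \
    --test_batch_size 16
```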