Closed DwyaneLQY closed 1 year ago
Thanks for your interest! If you resample your own Shakespeare subset, there is a chance you will get some examples that are too long for your GPU. Here are some strategies I would use to mitigate this problem:
If you call `sample_generate` `N` times, split the task into several chunks so they don't all hit your GPU at the same time. You may need to slightly modify the code. Let me know if this issue persists for you.
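The chunking idea above can be sketched roughly as follows. Note that `sample_generate` here is only a stand-in for the repository's actual generation call, and the chunk size is a placeholder you would tune to your GPU:

```python
def sample_generate(batch):
    # Placeholder for the repo's real generation call; in practice this
    # would move `batch` to the GPU and run the model on it.
    return [f"generated:{x}" for x in batch]

def generate_in_chunks(examples, chunk_size=8):
    """Run generation over `examples` in small chunks so that only
    `chunk_size` examples occupy GPU memory at any one time."""
    outputs = []
    for i in range(0, len(examples), chunk_size):
        chunk = examples[i:i + chunk_size]
        outputs.extend(sample_generate(chunk))
    return outputs

results = generate_in_chunks([f"ex{i}" for i in range(20)], chunk_size=8)
print(len(results))  # prints 20
```

The key point is that peak memory is bounded by the chunk size rather than by the full sample count, at the cost of a few extra forward passes.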
Great! I really appreciate your response. I carefully considered your advice and managed to successfully resolve the issue. Thank you once again for your assistance.
You guys are doing great work, and I really appreciate it! I'm currently going through the code, but I ran into an issue when attempting a few-shot TST experiment on the Shakespeare dataset: a CUDA out-of-memory error on an RTX 4090 (24 GB).
I'm using my own BERT-base-style classifier (I noticed that the model you provided is also BERT-base), and I have resampled a portion of the data from the original dataset for training. All other settings are left at the defaults in the code.
I'm a bit confused about why I'm exceeding the memory limit. Could it be that some of the sentences in the Shakespeare dataset are too long? I would greatly appreciate it if you could help answer my questions. Thank you!
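For anyone debugging a similar OOM, a quick way to check whether outlier-length examples are the culprit is to scan the resampled subset for its longest sentences. This sketch uses whitespace tokens as a rough proxy for subword-token counts (the real count from a BERT tokenizer would be somewhat larger), and the sample sentences are placeholders:

```python
def longest_examples(sentences, top_k=3):
    """Return the top_k (length, sentence) pairs, longest first.
    Whitespace splitting is a rough proxy for subword tokenization."""
    scored = sorted(((len(s.split()), s) for s in sentences), reverse=True)
    return scored[:top_k]

# Placeholder data; in practice, read your resampled Shakespeare subset here.
sample = [
    "thou art more lovely and more temperate",
    "to be or not to be",
    "word " * 100,  # an artificially long outlier
]
print(longest_examples(sample, top_k=1)[0][0])  # prints 100
```

If a handful of examples are far longer than the rest, truncating or filtering them (or shrinking the batch size) is usually enough to stay within 24 GB.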