Closed: Zhang-Henry closed this issue 2 years ago.
Hi, fine-tuning large models takes at least 16GB of GPU memory, so 10GB is not enough. If you really want to fine-tune with 10GB, you can try reducing `max_input_length`, though this will result in truncated documents. As for your second question, that is normal.
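To illustrate the trade-off, here is a minimal, hypothetical sketch of what lowering `max_input_length` does: inputs longer than the limit are cut off before they reach the model, which shrinks the per-batch activation memory but discards the tail of long documents. The function name and the whitespace "tokenization" below are illustrative only, not the repository's actual code.

```python
def truncate_inputs(token_ids_batch, max_input_length):
    """Keep at most max_input_length tokens per document.

    Shorter sequences mean smaller attention matrices and
    activations, hence lower GPU memory use, but any tokens
    past the limit are simply dropped (truncated documents).
    """
    return [ids[:max_input_length] for ids in token_ids_batch]

# Two toy "tokenized" documents: one long (1024 tokens), one short (200).
docs = [list(range(1024)), list(range(200))]

# With max_input_length=256 the long document is truncated,
# the short one passes through unchanged.
truncated = truncate_inputs(docs, max_input_length=256)
print([len(d) for d in truncated])  # [256, 200]
```

In practice you would set the equivalent option in the training script or config rather than truncating by hand; the right value is the largest one that still fits your 10GB budget.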
How can I solve the problems?
The main error message is shown below: