SamYuen101234 opened 1 month ago
Hi @SamYuen101234, thanks for raising an issue!
This is a question best placed in our forums. We try to reserve the GitHub issues for feature requests and bug reports.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
I am trying to implement a custom compute metric for the Trainer. The logits and labels are numpy arrays covering the full evaluation data; however, my evaluation input has shape (1000, 43, 50257), and the computation can't fit on a 24 GB L4 GPU on Colab. Is there a way to process the data in mini-batches, e.g. with a dataloader, instead of being handed the full numpy array?
```python
import numpy as np
from datasets import load_metric

# eval_pred is all the validation data, not only the mini-batch
def compute_metrics(eval_pred):
    accuracy_metric = load_metric("accuracy")
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy_metric.compute(predictions=predictions.flatten(), references=labels.flatten())
```
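For what it's worth, one way to keep the full (1000, 43, 50257) logits tensor from ever being accumulated is the Trainer's `preprocess_logits_for_metrics` hook, which runs on each batch before predictions are gathered, plus `eval_accumulation_steps` to offload to CPU in chunks. Below is a minimal sketch, assuming `model`, `training_args`, and `eval_dataset` are already defined in your script:

```python
import numpy as np
from datasets import load_metric
from transformers import Trainer

# Runs per batch on the GPU: reduce logits to token ids so the Trainer
# accumulates an int array of shape (num_examples, seq_len) instead of
# the full (num_examples, seq_len, vocab_size) float tensor.
def preprocess_logits_for_metrics(logits, labels):
    return logits.argmax(dim=-1)

accuracy_metric = load_metric("accuracy")

def compute_metrics(eval_pred):
    # predictions already hold argmax token ids thanks to the hook above
    predictions, labels = eval_pred
    mask = labels != -100  # assumes padding labels use the usual -100 sentinel
    return accuracy_metric.compute(predictions=predictions[mask], references=labels[mask])

trainer = Trainer(
    model=model,                      # assumed to be defined elsewhere
    args=training_args,               # assumed TrainingArguments instance
    eval_dataset=eval_dataset,        # assumed to be defined elsewhere
    compute_metrics=compute_metrics,
    preprocess_logits_for_metrics=preprocess_logits_for_metrics,
)
```

If memory is still tight, setting `eval_accumulation_steps=N` in `TrainingArguments` moves the accumulated predictions from GPU to CPU every N evaluation steps, so the device never holds the whole evaluation set at once.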