Closed: apethree closed this issue 1 year ago
I have made progress. The shapes of the arrays are:
p.predictions has the following structure:
p.predictions[0] --> array of shape (1200, 2)
p.predictions[1] --> list of two arrays, where [0] --> shape (1200, 768) and [1] --> shape (1200, 2)
What is the 768-wide array? I am not sure how to use it for the predictions and the F1 score.
This is probably the correct way to calculate argmax():
pred_labels = np.argmax(p.predictions[0], axis=1)
pred_scores = softmax(p.predictions[0], axis=1)[:, 1]
Instead of:
pred_labels = np.argmax(p.predictions, axis=1)
pred_scores = softmax(p.predictions, axis=1)[:, 1]
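Putting the pieces together, a minimal self-contained sketch of the corrected metric computation might look like the following. It assumes p.predictions[0] holds the (1200, 2) class logits and p.label_ids the gold labels, as described above; the compute_metrics name is my own, and the ROC AUC line is only there to show how pred_scores would be used, so this is not the toolkit's actual calc_classification_metrics.

```python
import numpy as np
from scipy.special import softmax
from sklearn.metrics import f1_score, roc_auc_score

def compute_metrics(p):
    # Assumption: p.predictions[0] is the (n_samples, 2) array of class logits.
    logits = p.predictions[0]
    pred_labels = np.argmax(logits, axis=1)       # hard class predictions
    pred_scores = softmax(logits, axis=1)[:, 1]   # probability of the positive class
    return {
        "f1": f1_score(p.label_ids, pred_labels),
        "roc_auc": roc_auc_score(p.label_ids, pred_scores),  # illustrates using pred_scores
    }
```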
Hey @apethree, you've hit the solution there. Due to a change in how EvalPrediction works, calc_classification_metrics is bugged. The array with the 768-wide dimension holds the embeddings before the final layer. Just use p.predictions[0] instead and you should be good.
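If you want the metric code to keep working whether the predictions come back as a bare array or as a tuple, a small defensive helper along these lines could be used (extract_logits is my own name, not something from the toolkit):

```python
import numpy as np

def extract_logits(predictions):
    """Return the class logits whether `predictions` is a bare array or a
    tuple/list whose first element holds the logits (later elements carrying
    the pre-final-layer embeddings)."""
    if isinstance(predictions, (tuple, list)):
        return np.asarray(predictions[0])
    return np.asarray(predictions)

# Usage: pred_labels = np.argmax(extract_logits(p.predictions), axis=1)
```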
Steps to reproduce: Notebook: https://github.com/georgian-io/Multimodal-Toolkit/tree/master/notebooks. Run this notebook; during the evaluation phase you will see this error:
There is some issue with the way numpy handles the argmax() call.
With axis=0 the error is:
With axis=1 the error is:
The shape of the numpy array is (2,), so I am not sure how to resolve this issue.
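As a quick diagnostic (my own sketch, not from the notebook), printing what p.predictions actually contains before calling argmax() shows whether it is a single array or a tuple of arrays; a two-element tuple would explain the (2,) shape, and argmax() would then need to be applied to the first element:

```python
def print_prediction_shapes(preds):
    # Show whether preds is a single array or a container of arrays/lists.
    if isinstance(preds, (tuple, list)):
        print(f"{type(preds).__name__} of {len(preds)} elements")
        for i, item in enumerate(preds):
            print(f"  [{i}]: {type(item).__name__}, shape {getattr(item, 'shape', 'n/a')}")
    else:
        print(f"{type(preds).__name__}, shape {getattr(preds, 'shape', 'n/a')}")

# Usage: print_prediction_shapes(p.predictions)
```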
------ Full evaluation error on train.evaluation() ----------