On first load, the "multi-qa-distilbert-cos-v1" model doesn't encode the narratives in our dataset correctly. If I reload the model, however, encoding works fine. I reproduced this on three different machines. I'm not sure how high a priority this is, but it's worth a look.