Closed: kristapsdz-saic closed this issue 5 days ago.
Hi @kristapsdz-saic, I encountered the same issue, which seems to be caused by the model's output structure: the model produces a 2D array as output. I tried the reranker model below instead, and it worked for me without any errors. However, further debugging is required to resolve the issue completely.
ranker = TransformersSimilarityRanker(model="BAAI/bge-reranker-large")
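The "2D array" symptom can be illustrated in isolation. This is a minimal sketch with toy score arrays, not Haystack's actual code; the shapes are assumptions based on the description above:

```python
import numpy as np

# A Cross-Encoder (SequenceClassification) head emits one relevance
# score per query-document pair, so 3 documents yield a 1D array:
cross_encoder_scores = np.array([0.1, 0.9, 0.4])
order = np.argsort(cross_encoder_scores)[::-1]
print(order.tolist())  # each entry is a plain int index -> [1, 2, 0]

# An embedding model returns a vector per document; treating its raw
# output as "scores" yields a 2D array, and argsort then produces a
# row of indices per document instead of one index per document:
embedding_outputs = np.array([[0.1, 0.9], [0.4, 0.3], [0.2, 0.8]])
bad_order = np.argsort(embedding_outputs)
print(bad_order.tolist())  # a list of lists, not a list of ints
```

Swapping in a cross-encoder such as BAAI/bge-reranker-large restores the 1D shape, which is consistent with the workaround working.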
Hey @kristapsdz-saic, @nvenkat94 is correct: this component only supports models with a Cross-Encoder architecture (the same as SequenceClassification in HuggingFace terms). Typically, models with reranker or cross-encoder in their name use this architecture and will be supported by this component.
The model provided in your original example, "sentence-transformers/all-MiniLM-L6-v2", is an embedding model (or Bi-Encoder), which is not supported by this component. There is a nice explanation of the difference between Bi-Encoders and Cross-Encoders from Sentence Transformers here.
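The architectural difference above can be sketched without any model weights. The functions below are toy stand-ins for the real networks (no real models involved), showing only the output contract that matters to the ranker:

```python
# Toy stand-ins illustrating the two output contracts: a Bi-Encoder
# maps each text to a vector, a Cross-Encoder maps each
# (query, document) pair to one scalar.

def toy_bi_encoder(text: str) -> list:
    # Stand-in for an embedding model: fixed-size vector per text.
    return [float(len(text)), float(text.count("e"))]

def toy_cross_encoder(query: str, doc: str) -> float:
    # Stand-in for a SequenceClassification head: one score per pair.
    shared = len(set(query.split()) & set(doc.split()))
    return shared / max(len(query.split()), 1)

query = "the quick dog"
docs = ["the quick fox", "a lazy dog", "the quick brown dog"]

# Cross-Encoder: N documents -> N scalars, directly sortable.
scores = [toy_cross_encoder(query, d) for d in docs]

# Bi-Encoder: N documents -> N vectors; a separate similarity step
# is still needed before any ranking can happen.
vectors = [toy_bi_encoder(d) for d in docs]

print(scores)
print(vectors)
```

A similarity ranker built for the first contract cannot consume the second one unchanged, which is why the component rejects plain embedding models.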
Describe the bug
The ranker fails regardless of its input.
Error message
The following code is used to trigger the output, although any exemplar invocation will do the trick:
When executed:
When examined, the i value in this code is assigned a list rather than a scalar: sorted_indices, from which i is drawn, is a list of lists.
Expected behavior
That i in the file would be a scalar index.
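The reported symptom can be reproduced in isolation. The shapes below are assumptions based on the description (sorted_indices arriving as a list of lists), not the component's actual internals:

```python
# Minimal reproduction of the symptom with assumed shapes.
documents = ["doc-a", "doc-b", "doc-c"]

# What the ranker expects: a flat list of indices, one per document.
good_sorted_indices = [2, 0, 1]
ranked = [documents[i] for i in good_sorted_indices]
print(ranked)

# What a 2D model output produces after sorting: a list of lists.
bad_sorted_indices = [[2, 0, 1]]
try:
    [documents[i] for i in bad_sorted_indices]
except TypeError as exc:
    # Each i is itself a list, and a list is not a valid index.
    print(f"TypeError: {exc}")
```

This matches the expectation stated above: with a scalar i the indexing succeeds, and with a list-valued i it raises a TypeError.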
To Reproduce
Run an invocation of the ranker with the following Pipfile:
FAQ Check
System: