neuml / txtai

💡 All-in-one open-source embeddings database for semantic search, LLM orchestration and language model workflows
https://neuml.github.io/txtai
Apache License 2.0

Can't specify embedding model via API? #632

Closed · scpedicini closed this 3 months ago

scpedicini commented 8 months ago

I've set up txtai with the following configuration.

# config.yml
writable: true

embeddings:
  content: true
  defaults: false

  indexes:
    minilm:
      path: "sentence-transformers/all-MiniLM-L6-v2"

    nli:
      path: "sentence-transformers/nli-mpnet-base-v2"

I'm trying to get the vector embeddings of a given document via the API, but I can't figure out how to specify which index/model to use, e.g.:

curl -X GET "http://localhost:8000/transform?text=hello&index=nli"

davidmezzetti commented 8 months ago

Thank you for the write-up on this.

Currently, the Embeddings instance doesn't support specifying a target index for the transform function. This would be a straightforward change; we can leave this issue open to add it.
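
In the meantime, one possible workaround (a sketch using the Python library directly rather than the API service) is to load each model as its own Embeddings instance and call transform on the one you want:

# Sketch of an interim workaround: one Embeddings instance per model.
# Model paths mirror the indexes defined in config.yml above.
from txtai.embeddings import Embeddings

minilm = Embeddings({"path": "sentence-transformers/all-MiniLM-L6-v2"})
nli = Embeddings({"path": "sentence-transformers/nli-mpnet-base-v2"})

# transform accepts an (id, data, tags) tuple and returns the embeddings vector
vector = nli.transform((None, "hello", None))
print(len(vector))

This runs outside the API process, so it sidesteps the missing index parameter rather than adding it.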