https://databend.rs/doc/sql-functions/ai-functions/
Vector store support is great, but generating embeddings requires the AI_EMBEDDING_VECTOR function, which is too tightly coupled.
It only uses the OpenAI service.
Thanks for mentioning this.
It only uses the OpenAI service.
Databend currently only provides support for OpenAI, but there is potential for adding other LLM models in the future.
Vector store support is great, but generating embeddings requires the AI_EMBEDDING_VECTOR function, which is too tightly coupled.
I'm still confused about "too coupled."
(We also discussed this over email.) Here is some explanation:
AI_EMBEDDING_VECTOR is a function that returns only a vector (as the ARRAY data type). This means you can store the resulting array in a Databend ARRAY column or in any other persistent array storage.
Query: select ai_embedding_vector('hi');
Result: [-0.03512698,-0.020624293,-0.015343423,-0.039803572,... <4096 items>]
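For example, here is a minimal sketch of persisting the result in a table. The documents table, its columns, and the ARRAY(FLOAT32) column type are assumptions for illustration, not something prescribed in this thread:

-- Hypothetical table with an ARRAY(FLOAT32) column to hold the embedding
CREATE TABLE documents (id INT, content VARCHAR, embedding ARRAY(FLOAT32));

-- Compute the embedding with ai_embedding_vector and persist it like any other value
INSERT INTO documents SELECT 1, 'hi', ai_embedding_vector('hi');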
Please let me know which version of decoupling you would prefer.
If embedding and storage are already separate, then I misunderstood.
// Embedding model and vector store are configured separately (LangChain-style)
let embeddings = OpenAIEmbeddings()
let s = Supabase(embeddings: embeddings)
My expectation is that generating embeddings and storing the arrays are separate, optional steps.
Yes, they are separate: you can use just the AI_EMBEDDING_VECTOR function to get the vector array and save it to any other store.
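For example (a sketch; the raw_docs table and its columns are hypothetical), a client could run the query below and push the returned arrays into whatever external store it prefers:

-- Only the embedding function is used here; storage is left to the caller
SELECT id, content, ai_embedding_vector(content) AS embedding FROM raw_docs;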
Thank you for your answer.
https://www.databend.com/apply/
I am on that waitlist...