feast-dev / feast

The Open Source Feature Store for Machine Learning
https://feast.dev
Apache License 2.0

MLOps architecture to reconcile Feast and vector search in the serving layer #3965

Open boumelhaa opened 8 months ago

boumelhaa commented 8 months ago

This is not an issue, but a design question. We are currently developing a scalable architecture for our ranking system using Feast. As a backend, we use GCP for the offline store and Redis on AWS for the online store, which sits close to our serving environment.

Feast effectively abstracts the feature vectors for classical models or batch inference. However, the complexity arises when we integrate embeddings into our recommendation system.

While Feast serves well for training the embeddings model and encoding the embeddings in offline batches, the challenge lies in serving these embeddings. My question pertains to how a vector search solution fits into our architecture. Where should the embeddings reside, and do we need to register them initially?

In essence: in our two-tower ranking model, the first tower's embeddings are encoded offline, while the second tower's embedding is computed per request and then searched against the pre-encoded embeddings. How can we structure this using Feast?
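To make the question concrete, here is a minimal sketch of the serving flow being described. This is illustrative only: the item ids, embedding dimensions, and helper names (`encode_query`, `top_k`) are made up, and in a real deployment the pre-encoded item embeddings would be materialized by Feast into the online store (or a search engine) rather than held in a plain array.

```python
import numpy as np

# --- Offline (batch) side: item-tower embeddings, pre-encoded in batch ---
rng = np.random.default_rng(0)
item_ids = ["item_a", "item_b", "item_c", "item_d"]
item_embeddings = rng.normal(size=(4, 8)).astype(np.float32)
# Normalize so that inner product == cosine similarity.
item_embeddings /= np.linalg.norm(item_embeddings, axis=1, keepdims=True)

def encode_query(raw_features: np.ndarray) -> np.ndarray:
    """Stand-in for the request-time query tower (normally a model forward pass)."""
    v = raw_features.astype(np.float32)
    return v / np.linalg.norm(v)

def top_k(query_vec: np.ndarray, k: int = 2) -> list:
    """Brute-force cosine search against the pre-encoded item embeddings."""
    scores = item_embeddings @ query_vec
    best = np.argsort(-scores)[:k]
    return [item_ids[i] for i in best]

query = encode_query(rng.normal(size=8))
print(top_k(query, k=2))  # the two most similar item ids
```

The open architectural question in the thread is exactly where the `item_embeddings` array and the `top_k` search should live: in the Feast online store, or in a dedicated vector search component.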

HaoXuAI commented 8 months ago

This is a feature we are planning to add to Feast. As an initial thought, we could add an API that can both index and retrieve embeddings; Faiss could probably serve as a first stab. Will initiate an RFC for the work.
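The "index and retrieve" API shape being proposed is roughly what `faiss.IndexFlatIP` provides. Below is a minimal stand-in written in plain NumPy so the API surface is visible without a Faiss dependency; the class name and method signatures are illustrative, not a real Feast or Faiss interface.

```python
import numpy as np

class FlatIPIndex:
    """Minimal stand-in for faiss.IndexFlatIP: exact inner-product search.
    A real integration would delegate to Faiss or a search engine."""

    def __init__(self, dim: int):
        self.dim = dim
        self._vectors = np.empty((0, dim), dtype=np.float32)

    def add(self, vectors: np.ndarray) -> None:
        """Index a batch of embeddings (rows)."""
        assert vectors.shape[1] == self.dim
        self._vectors = np.vstack([self._vectors, vectors.astype(np.float32)])

    def search(self, queries: np.ndarray, k: int):
        """Return (scores, indices) of the k best matches per query row."""
        scores = queries.astype(np.float32) @ self._vectors.T
        idx = np.argsort(-scores, axis=1)[:, :k]
        top = np.take_along_axis(scores, idx, axis=1)
        return top, idx

index = FlatIPIndex(dim=4)
index.add(np.eye(4, dtype=np.float32))  # four orthogonal "embeddings"
scores, ids = index.search(np.array([[0.0, 1.0, 0.0, 0.0]]), k=1)
print(ids)  # [[1]] -- the second stored vector matches exactly
```

As the next comment notes, a purely in-memory index like this raises scalability and re-indexing questions, which is what motivates backing the same API with an external engine instead.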

boumelhaa commented 7 months ago

Yeah, Faiss seems like a promising candidate, although the in-memory aspect can pose scalability and re-indexing challenges. Integrating it with Elasticsearch would enhance its appeal, allowing for the registration of embeddings and direct materialization into ES or any other search API, offering an alternative to traditional online store technologies.

HaoXuAI commented 7 months ago

@boumelhaa For the online store with search functionality, are you using Elasticsearch?

tokoko commented 7 months ago

Don't really know much about the subject, but feels like people might want to use different technologies for normal feature lookup and vector search. Is it the right time to also start thinking about (better) supporting configuration of multiple online stores in the same feast project?

HaoXuAI commented 7 months ago

> Don't really know much about the subject, but feels like people might want to use different technologies for normal feature lookup and vector search. Is it the right time to also start thinking about (better) supporting configuration of multiple online stores in the same feast project?

Not sure how that would help the search use case but yeah definitely a good feature to have.

franciscojavierarceo commented 7 months ago

This is a great topic and something I was actually quite excited about supporting. Glad to see @HaoXuAI already on it! 🚀

boumelhaa commented 7 months ago

> @boumelhaa For the online store with search functionality, are you using Elasticsearch?

I have used Elasticsearch in an initiative aimed at optimizing vector search and experimenting with its Approximate Nearest Neighbors (ANN) functionality. Locally, it demonstrated excellent speed for cosine-similarity search. Additionally, ES can be readily managed on AWS (OpenSearch) and by other cloud providers.
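For reference, the ES setup being described maps onto Elasticsearch's documented `dense_vector` field type and `knn` search option (syntax as of Elasticsearch 8.x; the index name, field names, and dimensions below are illustrative):

```json
PUT /item_embeddings
{
  "mappings": {
    "properties": {
      "item_id":   { "type": "keyword" },
      "embedding": {
        "type": "dense_vector",
        "dims": 64,
        "index": true,
        "similarity": "cosine"
      }
    }
  }
}

POST /item_embeddings/_search
{
  "knn": {
    "field": "embedding",
    "query_vector": [0.12, -0.03, ...],
    "k": 10,
    "num_candidates": 100
  }
}
```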

From my perspective, if we opt for ES or any other search technology, we'll need to abstract away all the tedious aspects of indexing the vectors and searching through them, as well as implement methods to retrieve the k most similar vectors using multiple algorithms (brute force, ANN, etc.), much as Faiss does.

In feature_store.yaml, I believe it would be prudent to keep it distinct from both the offline and online stores, optionally layered on top of them, as not all use cases will require search functionality.
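A hypothetical sketch of what such a config might look like. To be clear, the `search_store` section and its keys are invented here to illustrate the proposal; they are not real Feast configuration:

```yaml
project: ranking
provider: gcp
offline_store:
  type: bigquery
online_store:
  type: redis
# Hypothetical optional section, layered on top of the stores above;
# omitted entirely when no vector search is needed.
search_store:
  type: elasticsearch
  index: item_embeddings
  metric: cosine
```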

franciscojavierarceo commented 1 week ago

Just want to close this out: we ended up implementing Elasticsearch for the Feast VectorDB work. @boumelhaa did you see this?