
Add Open Inference Protocol to feature servers #4335

Open tokoko opened 2 weeks ago

tokoko commented 2 weeks ago

Is your feature request related to a problem? Please describe. The goal of this feature is to simplify Feast integration for model serving platforms. Feast feature servers expose custom HTTP/gRPC interfaces that can be tricky to integrate with. For example, KServe's Feast integration guide involves writing glue code to create a transformer and publishing a custom Docker image for each model integration.

Describe the solution you'd like This feature request proposes adding OIP-compliant endpoints to Feast feature servers. While the Open Inference Protocol is modeled for model execution and its lingo reflects that (the /infer endpoint, for example), I think it might still be worth reusing the same protocol to simplify integration, especially for serving requests involving FeatureServices, which from a user perspective are very similar to models.
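To make that concrete, here is a rough sketch of what a retrieval call against a FeatureService exposed over OIP might look like. The server address, the FeatureService name (driver_activity), and the mapping of entity keys onto OIP input tensors are all assumptions here, not a settled design:

```python
import requests

# Hypothetical: the FeatureService "driver_activity" is exposed as an OIP
# "model", so entity keys go in as input tensors and feature values come
# back as output tensors. The payload shape follows the Open Inference
# Protocol (V2) REST spec.
resp = requests.post(
    "http://feature-server:6566/v2/models/driver_activity/infer",
    json={
        "inputs": [
            {
                "name": "driver_id",
                "shape": [2],
                "datatype": "INT64",
                "data": [1001, 1002],
            }
        ]
    },
)
# Each requested feature would presumably come back as one output tensor.
print(resp.json()["outputs"])
```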

This request proposes adding new OIP-compliant endpoints to the feature server while leaving the existing ones intact. This matters because keeping the /get-online-features endpoint around means we don't have to support every Feast capability under OIP. For example, feature retrieval that passes a list of features instead of a FeatureService should only be supported through /get-online-features, as it doesn't map onto OIP nicely. Passing back metadata about row staleness along with the data payload may also remain /get-online-features-only, unless we find a good way to model it with OIP.
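For contrast, here is the ad-hoc feature-list case on the existing endpoint, which is the part that wouldn't carry over to OIP (the payload shape follows the current feature server API; the host and feature names are placeholders):

```python
import requests

# Existing /get-online-features endpoint: accepts an arbitrary feature
# list rather than a FeatureService, which has no natural OIP equivalent
# and would stay on this endpoint only.
resp = requests.post(
    "http://feature-server:6566/get-online-features",
    json={
        "features": [
            "driver_hourly_stats:conv_rate",
            "driver_hourly_stats:acc_rate",
        ],
        "entities": {"driver_id": [1001, 1002]},
    },
)
print(resp.json())
```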

Finally, to disambiguate this proposal from #4288 and the associated discussion, this feature request doesn't call for introducing model serving into Feast. Everything that happens on a feature server would still be feature retrieval only, merely disguised as OIP to simplify integration.

zoramt commented 2 weeks ago

This is an interesting proposal. Do you already have some design documents folks can study? For instance, how would one replace the KServe transformer-based example with one using the feature store's OIP endpoints?

franciscojavierarceo commented 2 weeks ago

@zoramt my comment here kind of outlines how the feature store could handle things: https://github.com/feast-dev/feast/issues/4288#issuecomment-2214502711

And this PR shows an example of how it could work in Feast: https://github.com/feast-dev/feast/pull/4282

We would swap the On Demand Feature View for a call to a KServe endpoint.
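A loose sketch of that swap, assuming a pandas-mode on-demand feature view: the upstream feature view driver_stats_fv, the predictor URL, and the model name are all hypothetical, and the transformation body simply forwards feature values to a KServe endpoint speaking the Open Inference Protocol:

```python
import pandas as pd
import requests
from feast import Field
from feast.on_demand_feature_view import on_demand_feature_view
from feast.types import Float64

# driver_stats_fv is a hypothetical upstream FeatureView providing
# "conv_rate" and "acc_rate".
@on_demand_feature_view(
    sources=[driver_stats_fv],
    schema=[Field(name="prediction", dtype=Float64)],
)
def driver_score(features: pd.DataFrame) -> pd.DataFrame:
    # Instead of transforming locally, forward the feature values to a
    # KServe predictor over the Open Inference Protocol (V2 REST).
    cols = features[["conv_rate", "acc_rate"]]
    resp = requests.post(
        "http://kserve-predictor/v2/models/driver_model/infer",
        json={
            "inputs": [
                {
                    "name": "input",
                    "shape": list(cols.shape),
                    "datatype": "FP64",
                    "data": cols.values.flatten().tolist(),
                }
            ]
        },
    )
    out = resp.json()["outputs"][0]
    return pd.DataFrame({"prediction": out["data"]})
```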

tokoko commented 2 weeks ago

> This is an interesting proposal. Do you already have some design documents folks can study? For instance, how would one replace the KServe transformer-based example with one using the feature store's OIP endpoints?

@zoramt I don't have a full demo yet, at least not the KServe side. The intention right now is to get initial feedback on whether the idea is worth pursuing, but I plan to start working on it soon. The idea is to treat Feast feature servers as yet another model serving runtime in KServe, one that can host multiple FeatureServices, and then use a KServe Inference Graph to chain two model invocations in sequence (the first being feature retrieval masquerading as a model, the second being the model itself).
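To sketch what that chain amounts to (an Inference Graph would express this declaratively inside KServe; the URLs and model names below are placeholders, not a proposed design):

```python
import requests

# The original request carries only entity keys, shaped as OIP input tensors.
entity_request = {
    "inputs": [
        {"name": "driver_id", "shape": [1], "datatype": "INT64", "data": [1001]}
    ]
}

# Step 1: feature retrieval masquerading as a model -- the feature server
# resolves entity keys to feature values over OIP.
features = requests.post(
    "http://feature-server:6566/v2/models/driver_activity/infer",
    json=entity_request,
).json()

# Step 2: the model itself -- feed the retrieved feature tensors straight
# into the model server's OIP endpoint (output tensors share the input
# tensor schema, so they can be passed through).
prediction = requests.post(
    "http://model-server/v2/models/driver_model/infer",
    json={"inputs": features["outputs"]},
).json()

print(prediction["outputs"])
```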