evidentlyai / evidently

Evaluate and monitor ML models from validation to production. Join our Discord: https://discord.com/invite/xZjKRaNp8b
Apache License 2.0

Passing Bearer token when connecting to remote workspace #1152

Open m-blasiak opened 3 weeks ago

m-blasiak commented 3 weeks ago

I am trying to set up self-hosted ML monitoring with Evidently.

I have Evidently running inside a container on a remote host. The host requires an Authorization header with a Bearer token.

According to the documentation, passing a secret is possible via:
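a "secret" argument on the remote workspace client, roughly like this (a minimal sketch of my reading of the docs; the import path and the placeholder URL/secret values are mine and may differ by version):

```python
from evidently.ui.workspace import RemoteWorkspace

# Connect to the self-hosted Evidently UI, passing the shared secret
# (placeholder values; the exact import path may differ between versions).
ws = RemoteWorkspace(base_url="https://my-evidently-host", secret="my-secret")
```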

However, it looks like Evidently passes the secret value as an "evidently-secret" header instead of an Authorization header (see here and here).

Would it be possible to allow users to specify the name of the header the secret should be attached to?

DimaAmega commented 2 weeks ago

Hi

We use the "evidently-secret" header to protect the write API. You should never use this functionality for authorization.

But, of course, you can put your own authorization proxy layer in front of the "evidently ui" server (I think modifying the Evidently source code is a bad idea for your purposes). This layer just checks the token and, if it is valid, proxies the request to the running "evidently ui" server.
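For example, a very rough sketch of such a layer (not something we ship; it assumes FastAPI and httpx, and that "evidently ui" listens on localhost:8000 behind it):

```python
# Run with: uvicorn proxy:app --host 0.0.0.0 --port 8080
import os

import httpx
from fastapi import FastAPI, HTTPException, Request, Response

EVIDENTLY_URL = os.getenv("EVIDENTLY_URL", "http://localhost:8000")  # where "evidently ui" listens (assumption)
EXPECTED_TOKEN = os.environ["PROXY_BEARER_TOKEN"]  # hypothetical env var holding the accepted token

app = FastAPI()


@app.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE", "PATCH"])
async def proxy(path: str, request: Request) -> Response:
    # Reject any request that does not carry the expected Bearer token.
    if request.headers.get("authorization") != f"Bearer {EXPECTED_TOKEN}":
        raise HTTPException(status_code=401, detail="Invalid or missing Bearer token")

    # Forward method, path, query string, body and (most) headers to the Evidently UI server.
    forward_headers = {
        k: v for k, v in request.headers.items() if k.lower() not in ("host", "content-length")
    }
    async with httpx.AsyncClient(base_url=EVIDENTLY_URL) as client:
        upstream = await client.request(
            request.method,
            f"/{path}",
            params=dict(request.query_params),
            content=await request.body(),
            headers=forward_headers,
        )

    # Return the upstream response; drop encoding/length headers that no longer apply.
    response_headers = {
        k: v
        for k, v in upstream.headers.items()
        if k.lower() not in ("content-length", "transfer-encoding", "content-encoding")
    }
    return Response(content=upstream.content, status_code=upstream.status_code, headers=response_headers)
```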

Is this helpful for you?

m-blasiak commented 2 weeks ago

Thanks for the reply @DimaAmega. Perhaps the documentation on this could be clarified a little. Right now it simply states:

[screenshot of the documentation]

It is a bit ambiguous, and I initially assumed this was basically the auth token required to access the remote server. MLflow has done something similar (see https://mlflow.org/docs/latest/auth/index.html#using-environment-variables).
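For reference, with MLflow the client just picks the credentials up from environment variables, along these lines (variable names as I understand them from MLflow's docs):

```python
import os

# MLflow's tracking client reads credentials from environment variables and attaches
# them to every request, e.g. a bearer token via MLFLOW_TRACKING_TOKEN
# (or MLFLOW_TRACKING_USERNAME / MLFLOW_TRACKING_PASSWORD for basic auth).
os.environ["MLFLOW_TRACKING_URI"] = "https://my-mlflow-host"
os.environ["MLFLOW_TRACKING_TOKEN"] = "<token>"
```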

I understand this is unfortunately not possible with Evidently. We can work around it and secure the service in a different way, but more clarity in the documentation would be appreciated.