Open KevinDuffy94 opened 3 weeks ago
Currently the service name for OTLP is hard-coded as "text-generation-inference.server". Could an environment variable be added to set it? Something like:

```python
resource = Resource.create(
    attributes={"service.name": os.getenv("OTLP_SERVICE_NAME", "text-generation-inference.server")}
)
```

And similarly in the main.rs file in the router.
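For the Rust side, a minimal sketch of the same fallback logic (assuming an `OTLP_SERVICE_NAME` variable, mirroring the Python snippet; the function name is hypothetical, and the real change would thread this value into the router's tracing resource setup):

```rust
use std::env;

// Hypothetical sketch: resolve the OTLP service name from an
// environment variable, falling back to the current hard-coded
// default when the variable is unset.
fn otlp_service_name() -> String {
    env::var("OTLP_SERVICE_NAME")
        .unwrap_or_else(|_| "text-generation-inference.server".to_string())
}

fn main() {
    println!("service.name = {}", otlp_service_name());
}
```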
I currently have multiple models deployed whose traces are collected into a single APM, so a shared service name will be problematic.
I'm not confident writing any Rust code.
WDYT @OlivierDehaene? Are you open to this addition?
Seen with Olivier, sounds good! Would you like to open a PR similar to https://github.com/huggingface/text-embeddings-inference/pull/285?