pgayvallet opened 2 months ago
ATM the task is exposed as a static export from the plugin's entrypoint:
https://github.com/elastic/kibana/blob/98aa1ab769a4b4f8c56ad3ef71f9b55e0343eb22/x-pack/plugins/inference/server/index.ts#L22
It doesn't follow the platform's principles, and it forces consumers to pass a lot of parameters (the inference API itself, a logger, and so on).
It is also problematic for traceability: e.g. we should be using the plugin's logger here, not the consumer's.
We need to change the way the task is exposed, so that it is exposed from the plugin's contract instead.
We should also probably move the task outside of the `inference` plugin, either to a dedicated one, or to `llmTasks`.
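
For illustration, here is a minimal sketch of what exposing the task from the plugin's start contract could look like. The contract shape (`InferenceServerStart`, `runTask`) and its parameters are hypothetical, not the actual API:

```ts
import type {
  CoreSetup,
  CoreStart,
  Plugin,
  Logger,
  PluginInitializerContext,
} from '@kbn/core/server';

// Hypothetical start contract; `runTask` and its params are illustrative.
export interface InferenceServerStart {
  runTask(params: { connectorId: string; input: string }): Promise<string>;
}

export class InferencePlugin implements Plugin<void, InferenceServerStart> {
  private readonly logger: Logger;

  constructor(context: PluginInitializerContext) {
    // Use the plugin's own logger so task traces are attributed correctly.
    this.logger = context.logger.get();
  }

  public setup(core: CoreSetup) {}

  public start(core: CoreStart): InferenceServerStart {
    return {
      // Consumers call the task through the start contract; the plugin
      // wires in its own logger and inference client internally, so
      // callers no longer pass those in as parameters.
      runTask: async ({ connectorId, input }) => {
        this.logger.debug(`running task against connector ${connectorId}`);
        // ...run the actual task using the plugin's internal services...
        return `processed: ${input}`;
      },
    };
  }
}
```

With this shape, consumers only depend on the plugin's start contract, and the plugin keeps ownership of its logger and internal clients.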
Pinging @elastic/appex-ai-infra (Team:AI Infra)