Open BaiMoHan opened 2 months ago
That sounds useful, I will look into it soon. I can try and implement it on our side if possible, otherwise middleware should work fine for now.
I'm glad to hear that you are considering this. I will first try the middleware approach to support my needs. If you have any other ideas, please let me know. I look forward to hearing from you.
Maybe the `user` field parameter should be logged.
I've noticed that the logs currently record the sampling parameters alongside the prompt. What I really need is the ability to log a trace_id for each request. My use case involves scenarios where network issues or other factors might disrupt communication, so I can't be certain whether a request has actually reached my inference engine node. To address this, I would like to pass a trace_id along with each request.
Currently, I'm considering adding a middleware to the api_server to achieve this. Do you have any better suggestions or recommendations on how to implement this feature more effectively?