Manouchehri opened 3 months ago
See https://github.com/openai/openai-node/issues/392#issuecomment-1779417929 for more info.
https://github.com/openai/openai-node?tab=readme-ov-file#customizing-the-fetch-client
Gotcha, that's an easy implementation, thanks!
Thanks! Should I be seeing this working in https://github.com/danny-avila/LibreChat/pkgs/container/librechat-dev-api/229754456?tag=4416f69a9b299c03dd947e52a033af202672597d? It looks like requests are still HTTP/1.1 for me, but I might be doing something wrong. :)
@Manouchehri I see, there's a little more to configure to enable h2 by default
https://github.com/nodejs/undici/issues/399 https://github.com/nodejs/undici/issues/2750
What features would you like to see added?
This should hopefully improve performance slightly. Right now, requests to my LiteLLM (i.e. custom OpenAI-compatible API) endpoint are all HTTP/1.1 from what I can see. =)
More details
See https://github.com/BerriAI/litellm/issues/3533#issue-2286515074 for example code in Python. I'm not entirely sure how to do it with Node.
Which components are impacted by your request?
General, Endpoints