Open ulan-yisaev opened 3 months ago

Describe the issue
I propose integrating Aleph Alpha's Luminous models into LLMLingua via their API to address performance concerns with local model execution. Aleph Alpha's completion endpoint supports returning log probabilities for both the prompt and the generated text:
https://docs.aleph-alpha.com/api/complete/
This integration could significantly speed up execution compared with running the default models locally, in line with the similar interest expressed in issue #70.

iofu728: Hi @ulan-yisaev, thank you for your support and for the information.
We will soon support the use of the Azure API for LLMLingua. Afterwards, we can extend the corresponding endpoint to other platforms. Thank you once again.

ulan-yisaev: Hi @iofu728, thank you for your response and for the update on Azure API support for LLMLingua. To prepare for extending LLMLingua to other platforms, could you share any existing examples or templates for integrating web-based LLM APIs with LLMLingua? That would be very helpful, as I'd like to integrate Aleph Alpha's API in a similar manner for my project.
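A minimal sketch of what such an integration point might look like, for discussion. It builds a request for the linked `/complete` endpoint that asks for per-token log probabilities and extracts them from a response. The field names (`log_probs`, `tokens`, `echo`, `maximum_tokens`) and the response shape are assumptions taken from the public API docs and would need to be checked against the current spec; the function names are hypothetical, not part of LLMLingua or Aleph Alpha's client.

```python
from typing import Any, Dict, List

# Assumed endpoint from the docs linked above; not verified here.
API_URL = "https://api.aleph-alpha.com/complete"


def build_logprob_request(prompt: str, model: str = "luminous-base",
                          maximum_tokens: int = 0) -> Dict[str, Any]:
    """Build a completion request asking for per-token log probabilities.

    Field names are assumptions based on the public API docs:
    `echo=True` is assumed to return logprobs for the prompt tokens,
    which is what a compressor needs to score an existing prompt.
    """
    return {
        "model": model,
        "prompt": prompt,
        "maximum_tokens": maximum_tokens,  # 0 -> score the prompt only
        "log_probs": 1,                    # top-1 logprob per token
        "tokens": True,                    # include the token strings
        "echo": True,                      # include prompt-token logprobs
    }


def extract_token_logprobs(response: Dict[str, Any]) -> List[float]:
    """Flatten per-token logprobs from a (hypothetical) response shape:
    {"completions": [{"log_probs": [{token: logprob}, ...]}]}.
    """
    completion = response["completions"][0]
    return [lp for tok_map in completion["log_probs"] for lp in tok_map.values()]
```

The point of the sketch is that LLMLingua's compression ranks tokens by their log probabilities, so the main integration surface is a thin HTTP-backed replacement for the local model's forward pass: `build_logprob_request` would be sent with `requests.post(API_URL, json=..., headers={"Authorization": "Bearer <token>"})`, and `extract_token_logprobs` would feed the scores back to the compressor.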