genai-impact / ecologits

🌱 EcoLogits tracks the energy consumption and environmental footprint of using generative AI models through APIs.
https://ecologits.ai/
Mozilla Public License 2.0

Add Perplexity provider #25

Closed · samuelrince closed 20 hours ago

samuelrince commented 3 months ago

Description

Add perplexity.ai LLM provider.

Solution

Perplexity uses the same API as OpenAI, meaning the OpenAI Python client is compatible with their service; only the API endpoint needs to change.

Client example: https://docs.perplexity.ai/docs/getting-started
Supported models: https://docs.perplexity.ai/docs/model-cards
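
For illustration, something like the sketch below (not from the EcoLogits codebase; the env var and model name are made up, the base URL comes from the Perplexity docs linked above):

```python
import os

from openai import OpenAI

# Hypothetical example: the official OpenAI Python client pointed at
# Perplexity's endpoint. Env var and model name are illustrative; see the
# model cards linked above for the current model list.
client = OpenAI(
    api_key=os.environ["PERPLEXITY_API_KEY"],  # a Perplexity key, not an OpenAI key
    base_url="https://api.perplexity.ai",      # only the endpoint changes
)

response = client.chat.completions.create(
    model="sonar",  # illustrative model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```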

We need to detect when the OpenAI client is being used with another provider, and we also need to support and register the models that Perplexity exposes through its API.
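
One possible way to detect the provider from inside the OpenAI instrumentation would be to inspect the client's `base_url`. This is only a sketch under that assumption, not how EcoLogits currently works, and the helper name is hypothetical:

```python
from urllib.parse import urlparse

# Hypothetical helper (not part of EcoLogits today): map the client's base_url
# to a provider name so the tracer can look models up in the right registry.
def detect_provider(base_url: str) -> str:
    host = urlparse(str(base_url)).hostname or ""
    if host.endswith("api.perplexity.ai"):
        return "perplexity"
    if host.endswith("openai.com"):
        return "openai"
    return "unknown"

# e.g. detect_provider(str(client.base_url)) would return "perplexity"
# for a client configured with the Perplexity endpoint.
```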

LucBERTON commented 2 months ago

~I created the feat/add-perplexity branch to work on this topic.~

~Regarding the OpenAI tracer~

~In the models data file, I added the Perplexity models under the openai provider in order to leverage the OpenAI tracer in its current state.~ ~We'll probably need to update the tracer so it can work with dynamic provider values.~

~Regarding the models data~

~I flagged the Perplexity models with model_architecture_not_released, but not the open-source ones.~ ~The parameter count of the 8x7B models is guessed to be similar to the mistral-small model.~

Edit: I deleted this branch since the code is old and I could never get an API key to actually test it. I think it's safer to start from scratch later.

adrienbanse commented 3 days ago

@samuelrince @LucBERTON Quick question here: why do we need to modify the tracer for another endpoint? I think adding the supported models to the OpenAI provider would be enough.

samuelrince commented 3 days ago

Yes, we can keep the OpenAI tracer; we just need to detect the right provider (by looking at the endpoint). Let's not mix Perplexity models with OpenAI ones.
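
An illustrative sketch of keeping the entries separate (the real EcoLogits model data is stored differently, and the model names are placeholders):

```python
# Keying the registry by (provider, model) keeps Perplexity and OpenAI entries
# separate even though both go through the same OpenAI tracer.
MODEL_REGISTRY = {
    ("openai", "gpt-4o"): ...,     # placeholder for OpenAI model metadata
    ("perplexity", "sonar"): ...,  # Perplexity metadata registered on its own
}

def lookup_model(provider: str, model: str):
    return MODEL_REGISTRY.get((provider, model))
```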

PS: I would not implement it for now unless it is requested by one of our users :)