genai-impact / ecologits

🌱 EcoLogits tracks the energy consumption and environmental footprint of using generative AI models through APIs.
https://ecologits.ai/
Mozilla Public License 2.0

This package should work with most LLM clients #7

Closed LucBERTON closed 5 months ago

LucBERTON commented 7 months ago

Description

This package should work with most LLM clients.

Solution

The first versions of this package may focus on the OpenAI client only, but future versions should allow all features to be used with any LLM client.

Considerations

N/A

Additional context

N/A

LucBERTON commented 7 months ago

Based on the Anyscale documentation, I think they do not have their own client and instead use the OpenAI client: https://docs.anyscale.com/endpoints/model-serving/openai-migration-guide

LucBERTON commented 7 months ago

Anthropic client documentation : https://docs.anthropic.com/claude/reference/client-sdks

LucBERTON commented 7 months ago

I started working on the Anthropic client on this branch: /feat/anthropic-sdk

So far I have only added the Anthropic dependency to the project and created the anthropic wrapper file.

samuelrince commented 7 months ago

Excellent @LucBERTON, I'll test today or tomorrow! FYI, Anthropic gives you $5 of credit to test with when you create a new account.

LucBERTON commented 7 months ago

I think I managed to make a working wrapper for the Anthropic library.

I tested it like so:

# from anthropic import Anthropic
from client.anthropic_wrapper import Anthropic

api_key = "<your_api_key>"  # placeholder: insert your own key

client = Anthropic(
    # defaults to os.environ.get("ANTHROPIC_API_KEY")
    api_key=api_key,
)
message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude"}
    ]
)
print(message.content)
print(message.impacts)

Output:

[ContentBlock(text="Hello! It's nice to meet you. How can I assist you today?", type='text')]
energy=0.15561 energy_unit='Wh'
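
For reference, the interception pattern such a wrapper relies on can be sketched in a few lines. The class names and the per-token energy figure below are illustrative stand-ins, not the actual genai_impact internals:

```python
from dataclasses import dataclass
from types import SimpleNamespace

# Illustrative per-token figure, NOT a real measurement.
WH_PER_OUTPUT_TOKEN = 0.01


@dataclass
class Impacts:
    energy: float
    energy_unit: str = "Wh"


class MessagesWrapper:
    """Delegates `create` to the wrapped messages API, then attaches an estimate."""

    def __init__(self, wrapped_messages):
        self._wrapped = wrapped_messages

    def create(self, **kwargs):
        response = self._wrapped.create(**kwargs)
        # Attach the energy estimate to the provider's response object.
        response.impacts = Impacts(
            energy=response.output_tokens * WH_PER_OUTPUT_TOKEN
        )
        return response


# Stand-in for the real SDK, so the sketch runs without an API key.
class FakeMessages:
    def create(self, **kwargs):
        return SimpleNamespace(content="Hello!", output_tokens=16)


client = SimpleNamespace(messages=MessagesWrapper(FakeMessages()))
message = client.messages.create(model="fake-model", max_tokens=1024)
print(message.content)
print(message.impacts)
```

The real wrapper delegates to the genuine Anthropic client and derives its estimate from the model and token usage, but the delegate-then-attach shape matches the `message.impacts` output shown above.
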

aqwvinh commented 6 months ago

Hello @LucBERTON and @samuelrince, I can try working on transformers from HuggingFace. I'll do it on the branch /feat/transformers-sdk if that's OK with you.

I'll try to work on text generation, cf. the documentation; please redirect me if that's the wrong one :)

samuelrince commented 6 months ago

Hey @aqwvinh, we definitely have to support HF! But I am not clear on how we should integrate the transformers lib into the package.

The transformers lib is made to run models locally, so in that use case I would recommend using codecarbon instead of genai_impact to get real energy consumption measurements. That said, I can see one case where you would prefer genai_impact over codecarbon: when the software sensors for energy consumption are not available, either because the "sensor API" is not supported by codecarbon, or because the API is disabled or non-existent (specific hardware, cloud instances, etc.)

Another thing we can consider for HuggingFace is supporting their Inference Endpoints. I have never personally used them, but I think that in Python you either query the API directly or use the huggingface_hub package.
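
As a rough sketch of the direct-query route, using only the standard library. The endpoint URL is a placeholder for your own deployment, and the payload shape is my assumption of a text-generation request body:

```python
import json
import urllib.request

# Placeholder: replace with your own Inference Endpoint's URL.
ENDPOINT_URL = "https://your-endpoint.endpoints.huggingface.cloud"


def build_payload(prompt: str, max_new_tokens: int = 256) -> bytes:
    """Build the JSON body for a text-generation request."""
    body = {"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}}
    return json.dumps(body).encode("utf-8")


def query_endpoint(token: str, prompt: str):
    """POST the prompt to the endpoint and decode the JSON response."""
    request = urllib.request.Request(
        ENDPOINT_URL,
        data=build_payload(prompt),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```

A wrapper could intercept calls at this level (or around whatever client huggingface_hub provides) and attach impacts to the decoded response, the same way the Anthropic wrapper does.
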

aqwvinh commented 6 months ago

Hey @samuelrince, sounds good to me! I've never used Inference Endpoints. I took a quick look at the documentation and found this. I can try working on huggingface_hub then.