traceloop / openllmetry-ruby

Sister project to OpenLLMetry, but in Ruby. Open-source observability for your LLM application, based on OpenTelemetry
Apache License 2.0

Add support for Gemini #7

Closed · weilandia closed this issue 3 months ago

weilandia commented 3 months ago

Here's a janky, half-baked solution we're using in the context of our own `Llm::Chat::Gemini::ChatResponse`:

      # Shape the Gemini response into the OpenAI-style choices/usage payload
      # that traceloop's Tracer#log_response reads.
      def response_for_log
        {
          "model" => model,
          "choices" => [
            {
              "message" => {
                "role" => "model",
                "content" => response.candidates.pluck("content").pluck("parts").flatten.pluck("text").join(" ")
              }
            }
          ],
          "usage" => {
            "total_tokens" => input_tokens + output_tokens,
            "prompt_tokens" => input_tokens,
            "completion_tokens" => output_tokens
          }
        }
      end

      def log_response
        require "traceloop/sdk"

        # Reach into the SDK's internals to grab its OpenTelemetry tracer,
        # then wrap our own span in a Traceloop::SDK::Traceloop::Tracer.
        traceloop = Traceloop::SDK::Traceloop.new
        tracer = traceloop.instance_variable_get(:@tracer)

        tracer.in_span(model) do |span|
          t = traceloop.class::Tracer.new(span)
          t.log_prompt(model, prompt)
          t.log_response(response_for_log)
        end
      end
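
For context, the `pluck` chain in `response_for_log` assumes a Gemini `generateContent`-style candidates array. Here's a rough, illustrative sketch of that shape and how it collapses into the content string (the values are made up, and `pluck` on plain arrays comes from ActiveSupport):

      require "active_support/core_ext/enumerable" # provides Enumerable#pluck

      # Illustrative candidates payload (shape assumed from Gemini's generateContent response)
      candidates = [
        { "content" => { "role" => "model", "parts" => [{ "text" => "Hello" }, { "text" => "world" }] } }
      ]

      candidates.pluck("content").pluck("parts").flatten.pluck("text").join(" ")
      # => "Hello world"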
weilandia commented 3 months ago

Further, here's some of the weirdness we're considering to get to the interface we want:

# frozen_string_literal: true

require "singleton"
require "traceloop/sdk"

class TraceloopClient
  include Singleton

  attr_accessor :conn, :spanner

  def initialize
    self.conn = Traceloop::SDK::Traceloop.new
    # Pull the underlying OpenTelemetry tracer out of the SDK instance.
    self.spanner = conn.instance_variable_get(:@tracer)
  end

  # Opens a span and yields a traceloop Tracer bound to it, so callers can
  # log prompts/responses without touching the SDK internals themselves.
  def self.trace(name)
    instance.spanner.in_span(name) do |span|
      tracer = Traceloop::SDK::Traceloop::Tracer.new(span)
      yield(tracer)
    end
  end
end
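
If we go this route, call sites would look roughly like the sketch below (hypothetical usage; `model`, `prompt`, and `response_for_log` are the same pieces as in the previous comment):

      # Hypothetical call site: model, prompt, and response_for_log come from
      # the ChatResponse object shown in the previous comment.
      TraceloopClient.trace(model) do |tracer|
        tracer.log_prompt(model, prompt)
        tracer.log_response(response_for_log)
      end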
nirga commented 3 months ago

Fixed with #8