ksylvest / omniai-openai

An implementation of the OmniAI interface for OpenAI.
https://rubydoc.info/github/ksylvest/omniai-openai
MIT License

Feature? adding Ollama and LocalAI namespaces #17

Closed: MadBomber closed this issue 6 days ago

MadBomber commented 1 week ago

I'm playing around with this stuff purely for introspection reasons. When I create a client for Ollama or LocalAI, I expect to see those classes in the namespace, not OpenAI. Call me silly, but that's just what I'm thinking.

Do you see a downside to this technique?

module OmniAI
  module LocalAI
    class Client
      def initialize(**kwargs)
        kwargs[:host]     = 'http://localhost:8080'
        kwargs[:api_key]  = nil
        @openai_client    = OmniAI::OpenAI::Client.new(**kwargs)
      end

      # Forward all missing instance methods to the internal OpenAI client
      def method_missing(method, *args, &block)
        if @openai_client.respond_to?(method)
          @openai_client.send(method, *args, &block)
        else
          super
        end
      end

      # Keep respond_to? consistent with method_missing
      def respond_to_missing?(method, include_private = false)
        @openai_client.respond_to?(method, include_private) || super
      end

      # Forward all missing class methods to the internal OpenAI client's class
      def self.method_missing(method, *args, &block)
        if OmniAI::OpenAI::Client.respond_to?(method)
          OmniAI::OpenAI::Client.send(method, *args, &block)
        else
          super
        end
      end

      # Keep class-level respond_to? consistent as well
      def self.respond_to_missing?(method, include_private = false)
        OmniAI::OpenAI::Client.respond_to?(method, include_private) || super
      end
    end
  end
end

module OmniAI
  module Ollama
    class Client
      def initialize(**kwargs)
        kwargs[:host]     = 'http://localhost:11434'
        kwargs[:api_key]  = nil
        @openai_client    = OmniAI::OpenAI::Client.new(**kwargs)
      end

      # Forward all missing instance methods to the internal OpenAI client
      def method_missing(method, *args, &block)
        if @openai_client.respond_to?(method)
          @openai_client.send(method, *args, &block)
        else
          super
        end
      end

      # Keep respond_to? consistent with method_missing
      def respond_to_missing?(method, include_private = false)
        @openai_client.respond_to?(method, include_private) || super
      end

      # Forward all missing class methods to the internal OpenAI client's class
      def self.method_missing(method, *args, &block)
        if OmniAI::OpenAI::Client.respond_to?(method)
          OmniAI::OpenAI::Client.send(method, *args, &block)
        else
          super
        end
      end

      # Keep class-level respond_to? consistent as well
      def self.respond_to_missing?(method, include_private = false)
        OmniAI::OpenAI::Client.respond_to?(method, include_private) || super
      end
    end
  end
end
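For what it's worth, here is a self-contained sketch of what the wrapper buys for introspection. It uses a plain-Ruby stand-in (`FakeOpenAIClient` and `WrapperClient` are made-up names) so it runs without the gem installed:

```ruby
# Stand-in for OmniAI::OpenAI::Client, so this sketch needs no gems.
class FakeOpenAIClient
  def chat(prompt)
    "echo: #{prompt}"
  end
end

# Same delegation pattern as the wrapper above.
class WrapperClient
  def initialize
    @inner = FakeOpenAIClient.new
  end

  # Forward missing instance methods to the inner client
  def method_missing(method, *args, &block)
    if @inner.respond_to?(method)
      @inner.send(method, *args, &block)
    else
      super
    end
  end

  # Keep respond_to? consistent with method_missing
  def respond_to_missing?(method, include_private = false)
    @inner.respond_to?(method, include_private) || super
  end
end

client = WrapperClient.new
client.chat("hi")          # => "echo: hi" (delegated to the inner client)
client.class.name          # => "WrapperClient" (introspection reports the wrapper)
client.respond_to?(:chat)  # => true, thanks to respond_to_missing?
```

The `respond_to_missing?` hook matters: without it, `respond_to?(:chat)` would return false even though the call is forwarded successfully.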
ksylvest commented 6 days ago

I think that pattern is fine, but it might be easier to subclass the OpenAI client instead, to reduce the code. I think that'd be a fine client to publish if you are interested in pushing separate gems for each.

MadBomber commented 6 days ago

I went the subclass route first, but when I did @client.class.name it would always return the OpenAI class name. There is nothing "wrong" with that approach; I'm just a little fixated on introspection, expecting the instance to tell me its true class name.
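For the record, a plain-Ruby check (hypothetical `Parent`/`Child` names) shows that #class does report the subclass, so that earlier result must have come from constructing the parent directly somewhere:

```ruby
class Parent; end
class Child < Parent; end

child = Child.new
child.class.name     # => "Child" (the subclass, not the parent)
child.is_a?(Parent)  # => true (still usable anywhere a Parent is expected)
```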

I made minor modifications to this code to allow for the possibility that there may be environment variables associated with the host and api_key, using this pattern:

kwargs[:host]     ||= ENV.fetch('OLLAMA_HOST', 'http://localhost:11434')
kwargs[:api_key]  ||= ENV.fetch('OLLAMA_API_KEY', nil) # nil default so a missing envvar does not raise
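One wrinkle to keep in mind: ENV.fetch needs an explicit default (even a nil one) to avoid raising when the variable is unset. A quick sketch, using made-up variable names:

```ruby
# Hypothetical variable names, deleted first so the demo is deterministic.
ENV.delete('DEMO_HOST')
ENV.delete('DEMO_MISSING_KEY')

# With a default, fetch never raises:
ENV.fetch('DEMO_HOST', 'http://localhost:1234')  # => "http://localhost:1234"

# Without a default, an unset variable raises KeyError:
begin
  ENV.fetch('DEMO_MISSING_KEY')
rescue KeyError
  # raised because no default was supplied
end
```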

For now I'm keeping these two new modules as extensions within my app. If it works out there and you agree, I will publish both as extensions within your omniai namespace.

MadBomber commented 6 days ago

Kevin, you are right. I went back and revisited the code I wrote when I attempted the subclass approach, and I totally blew it. That appears to be the simplest and most straightforward way to do what I was attempting to do.

What I think happened was that I got tied up trying to protect the subclass from having to know about any keyword parameters other than the two "required" ones, host and api_key.

module OmniAI
  module LocalAI
    class Client < OmniAI::OpenAI::Client
      def initialize(**kwargs)
        kwargs[:host]     ||= ENV.fetch('LOCALAI_HOST','http://localhost:8080')
        kwargs[:api_key]  ||= ENV.fetch('LOCALAI_API_KEY', nil) # nil default so a missing envvar does not raise
        super(**kwargs)
      end
    end
  end
end
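And a self-contained sketch of the same idea, with a stand-in parent class (`FakeParentClient` is a made-up name) so it runs without the gem, showing that the keyword defaults flow through super and that introspection now reports the subclass:

```ruby
# Stand-in for OmniAI::OpenAI::Client.
class FakeParentClient
  attr_reader :host, :api_key

  def initialize(**kwargs)
    @host    = kwargs[:host]
    @api_key = kwargs[:api_key]
  end
end

class LocalClient < FakeParentClient
  def initialize(**kwargs)
    kwargs[:host]    ||= ENV.fetch('LOCALAI_HOST', 'http://localhost:8080')
    kwargs[:api_key] ||= ENV.fetch('LOCALAI_API_KEY', nil)
    super(**kwargs)
  end
end

# Ensure the envvars are unset so the defaults apply in this demo.
ENV.delete('LOCALAI_HOST')
ENV.delete('LOCALAI_API_KEY')

client = LocalClient.new
client.class.name  # => "LocalClient" (the subclass, not the parent)
client.host        # => "http://localhost:8080"
client.api_key     # => nil
```

The subclass only touches the two keywords it cares about; everything else in kwargs passes through to the parent untouched.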