kitlangton / automaton

RPC + GPT

Usage question: how to use behind a proxy? #4

Open nemoo opened 1 year ago

nemoo commented 1 year ago

I had asked for help here: https://github.com/zio/zio-openai/issues/31 and followed the suggested solution of providing my own custom ClientConfig for zio-http. But now I think the config issue may be more automaton-specific, so I am opening this question here.

I am just trying to use automaton behind a proxy.

When I try this:

  val run = {
    val url: URL = URL.fromString("http://10.0.2.2:3128")
      .toOption
      .getOrElse(URL.empty)
    println(s"url: $url")
    val proxy = Proxy(url)
    ChatService
      .prompt[ActionService](
        "Slack me a poem in the style of ee cummings about the weather. Also, what's 2156 + 2366."
      )
      .provide(
        ChatService.live[ActionService],
        ActionService.layer,
        ClientConfig.live(
          ClientConfig().proxy(proxy)
        ),
        Client.fromConfig
      )
  }

It still does not use the proxy.

timestamp=2023-04-12T08:48:48.974431188Z level=ERROR thread=#zio-fiber-0 message="" cause="Exception in thread "zio-fiber-4" java.lang.Exception: Unknown(io.netty.channel.ConnectTimeoutException: connection timed out: api.openai.com/104.18.7.192:443)

Any ideas on how else I could configure the proxy?
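For reference, here is a stripped-down check that should show whether the proxy config is honored at all outside of automaton, and that fails loudly if the proxy URL does not parse instead of silently falling back to URL.empty. This is only a sketch: the imports are guessed for the zio-http version used above, the /v1/models path is just an arbitrary HTTPS target, and the Client.request(String, ...) overload is assumed from the 0.0.x API.

  import zio._
  import zio.http.{Client, ClientConfig, Proxy, URL}

  object ProxyCheck extends ZIOAppDefault {

    // Fail loudly if the proxy URL does not parse, instead of silently
    // falling back to URL.empty (which would effectively disable the proxy).
    val proxyUrl: URL =
      URL.fromString("http://10.0.2.2:3128")
        .fold(err => throw new IllegalArgumentException(s"bad proxy URL: $err"), identity)

    val run =
      Client
        .request("https://api.openai.com/v1/models") // any HTTPS endpoint works for this check
        .flatMap(res => Console.printLine(s"status: ${res.status}"))
        .provide(
          ClientConfig.live(ClientConfig(proxy = Some(Proxy(proxyUrl)))),
          Client.fromConfig
        )
  }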

nemoo commented 1 year ago

Now I have come up with this approach, changing the ChatService.live method to use the proxy:

    ClientConfig.live(
      ClientConfig(proxy =
        Some(Proxy(URL.fromString("http://10.0.2.2:3128").toOption.get))
      )
    ) >>>
      ZClient.fromConfig >>> Chat.live >+>
      ZLayer(Ref.make(NonEmptyChunk(ChatCompletionRequestMessage(Role.System, instructions)))) >>>
      ZLayer.fromFunction(ChatService.apply[Service](_, _, _, handler))

This compiles, but the proxy is still not used.
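
As a further sanity check (only a sketch, reusing the same layer as above), logging the ClientConfig that ZClient.fromConfig receives should at least rule out the proxy getting dropped somewhere in the layer graph, as opposed to being ignored by the underlying client:

  // Sketch: print the config that the client layer will actually receive.
  val debugConfig =
    ZIO
      .service[ClientConfig]
      .debug("effective ClientConfig")
      .provide(
        ClientConfig.live(
          ClientConfig(proxy = Some(Proxy(URL.fromString("http://10.0.2.2:3128").toOption.get)))
        )
      )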