Kong / ngx_wasm_module

Nginx + WebAssembly
Apache License 2.0

Plugin hangs after on_http_request_body #575

Closed: sluramod closed this issue 1 month ago

sluramod commented 4 months ago

I've built a very simple plugin for Kong 3.6.1:

use std::sync::Arc;

use proxy_wasm::traits::{Context, HttpContext};
use proxy_wasm::types::Action;

#[derive(Default)]
pub(crate) struct EventHttpContext {
    pub(crate) context_id: u32,
    pub(crate) config: Arc<Config>, // Config is defined elsewhere in the crate
}

impl Context for EventHttpContext {}

impl HttpContext for EventHttpContext {
    fn on_http_request_headers(&mut self, _: usize, _: bool) -> Action {
        log::info!("on_http_request_headers: {} << ", self.context_id);
        Action::Continue
    }

    fn on_http_request_body(&mut self, _num_elements: usize, end_of_stream: bool) -> Action {
        log::info!("on_http_request_body: {}, end_of_stream: {} <<< ", self.context_id, end_of_stream);
        log::info!("on_http_request_body: {}, end_of_stream: {} >>> ", self.context_id, end_of_stream);
        Action::Continue
    }

    fn on_http_response_headers(&mut self, _: usize, _: bool) -> Action {
        log::info!("on_http_response_headers: {} << ", self.context_id);
        log::info!("on_http_response_headers: {} >> ", self.context_id);
        Action::Continue
    }

    fn on_http_response_body(&mut self, num_elements: usize, end_of_stream: bool) -> Action {
        log::info!("on_http_response_body: {}, end_of_stream: {} << ", self.context_id, end_of_stream);
        log::info!("on_http_response_body: {}, end_of_stream: {} >> ", self.context_id, end_of_stream);

        Action::Continue
    }

    fn on_log(&mut self) {
        log::info!("on_log: {} << ", self.context_id);
        log::info!("on_log: {} >> ", self.context_id);
    }
}

impl EventHttpContext {
}
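
The snippet above omits the module entry point. The create_http_context_2 lines in the logs below come from a root-context factory; a rough sketch of that wiring with the standard proxy-wasm Rust SDK looks like this (the EventRootContext name and the assumption that Config implements Default are illustrative, not part of the actual filter):

use proxy_wasm::traits::RootContext;
use proxy_wasm::types::{ContextType, LogLevel};

// Illustrative sketch of the entry point, not the filter's verbatim source.
#[derive(Default)]
struct EventRootContext {
    config: Arc<Config>, // assumes Config: Default; parsing in on_configure omitted
}

impl Context for EventRootContext {}

impl RootContext for EventRootContext {
    fn get_type(&self) -> Option<ContextType> {
        Some(ContextType::HttpContext)
    }

    fn create_http_context(&self, context_id: u32) -> Option<Box<dyn HttpContext>> {
        // Produces the "create_http_context_2: '<id>'" log lines seen below.
        log::info!("create_http_context_2: '{}'", context_id);
        Some(Box::new(EventHttpContext {
            context_id,
            config: Arc::clone(&self.config),
        }))
    }
}

proxy_wasm::main! {{
    proxy_wasm::set_log_level(LogLevel::Info);
    proxy_wasm::set_root_context(|_| -> Box<dyn RootContext> {
        Box::new(EventRootContext::default())
    });
}}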

This plugin hangs after a new instance of this context is created. Relevant logs:

2024/07/17 21:50:29 [debug] 1325#0: *10305 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055E700FAFD60, store: 000055E70002E250)
2024-07-17T21:50:29.469699779Z 2024/07/17 21:50:29 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] create_http_context_2: '1', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:29.469709064Z 2024/07/17 21:50:29 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024-07-17T21:50:29.469712423Z 2024/07/17 21:50:29 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_request_headers: 1 << , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:29.542044609Z 2024/07/17 21:50:29 [debug] 1325#0: *10305 [proxy-wasm] setting next action: pwctx->action = "CONTINUE" (pwctx: 000055E700E753C0)
2024-07-17T21:50:29.542076446Z 2024/07/17 21:50:29 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_request_body" step in "content" phase
2024-07-17T21:50:29.542080691Z 2024/07/17 21:50:29 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_request_body: 1, end_of_stream: true <<< , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:29.542087156Z 2024/07/17 21:50:29 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_request_body: 1, end_of_stream: true >>> , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:29.542238713Z 2024/07/17 21:50:29 [debug] 1325#0: *10305 [lua] init.lua:1362: balancer(): setting address (try 1): 104.18.6.192:443
2024-07-17T21:50:29.542394095Z 2024/07/17 21:50:29 [debug] 1325#0: *10305 [lua] init.lua:1395: balancer(): enabled connection keepalive (pool=104.18.6.192|443|api.openai.com, pool_size=512, idle_timeout=60, max_requests=10000)
2024-07-17T21:50:31.074861907Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 000055E700E753C0)
2024-07-17T21:50:31.074896775Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_response_headers" step in "header_filter" phase
2024-07-17T21:50:31.074900937Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_response_headers: 1 <<  while reading response header from upstream, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.074904292Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_response_headers: 1 >>  while reading response header from upstream, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.074918673Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
2024-07-17T21:50:31.074920856Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: false <<  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.074923260Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: false >>  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.074939919Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
2024-07-17T21:50:31.074949496Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: true <<  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.074954318Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: true >>  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.075198080Z 172.28.0.4 - - [17/Jul/2024:21:50:31 +0000] "POST /openai/v1/chat/completions HTTP/1.1" 200 423 "-" "OpenAI/v1 PythonBindings/0.28.1" kong_request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.075243512Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_log" step in "log" phase
2024-07-17T21:50:31.075252475Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_log: 1 <<  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.075255744Z 2024/07/17 21:50:31 [info] 1325#0: *10305 [proxy-wasm]["log_filter" #1] on_log: 1 >>  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "7920203e5ba11c58f1f6e17dd59eb211"
2024-07-17T21:50:31.075258174Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_done" step in "done" phase
2024-07-17T21:50:31.075260171Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm]["log_filter" #1] filter 1/1 finalizing context
2024-07-17T21:50:31.075269125Z 2024/07/17 21:50:31 [debug] 1325#0: *10305 [proxy-wasm] "log_filter" filter freeing context #1 (1/1)
2024-07-17T21:50:31.076461054Z 2024/07/17 21:50:31 [info] 1325#0: *10305 client 172.28.0.4 closed keepalive connection
2024-07-17T21:50:33.128561737Z 2024/07/17 21:50:33 [debug] 1321#0: *10309 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055E700FAFD60, store: 000055E70002E250)
2024-07-17T21:50:33.128612123Z 2024/07/17 21:50:33 [info] 1321#0: *10309 [proxy-wasm]["log_filter" #1] create_http_context_2: '1', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "a7b0016d6cda0723fc4df8700eb19a5a"
2024-07-17T21:50:33.128618128Z 2024/07/17 21:50:33 [debug] 1321#0: *10309 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
...
...
2024/07/17 21:50:40 [debug] 1322#0: *10317 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_log" step in "log" phase
2024-07-17T21:50:40.962494233Z 2024/07/17 21:50:40 [info] 1322#0: *10317 [proxy-wasm]["log_filter" #1] on_log: 1 <<  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "056b8f7fbd1eec3e16a43672a2a04626"
2024-07-17T21:50:40.962499741Z 2024/07/17 21:50:40 [info] 1322#0: *10317 [proxy-wasm]["log_filter" #1] on_log: 1 >>  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "056b8f7fbd1eec3e16a43672a2a04626"
2024-07-17T21:50:40.962514421Z 2024/07/17 21:50:40 [debug] 1322#0: *10317 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_done" step in "done" phase
2024-07-17T21:50:40.962518296Z 2024/07/17 21:50:40 [debug] 1322#0: *10317 [proxy-wasm]["log_filter" #1] filter 1/1 finalizing context
2024-07-17T21:50:40.962521794Z 2024/07/17 21:50:40 [debug] 1322#0: *10317 [proxy-wasm] "log_filter" filter freeing context #1 (1/1)
2024-07-17T21:50:40.963868924Z 2024/07/17 21:50:40 [info] 1322#0: *10317 client 172.28.0.4 closed keepalive connection
2024-07-17T21:50:50.218233088Z 2024/07/17 21:50:50 [debug] 1334#0: *10321 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055E70106B2D0, store: 000055E70002E250)
2024-07-17T21:50:50.218275687Z 2024/07/17 21:50:50 [info] 1334#0: *10321 [proxy-wasm]["log_filter" #2] create_http_context_2: '2', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "cd637736833a766d589b9f1affe3f12e"
2024-07-17T21:50:50.218283010Z 2024/07/17 21:50:50 [debug] 1334#0: *10321 [proxy-wasm]["log_filter" #2] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024-07-17T21:50:50.218326138Z 2024/07/17 21:50:50 [info] 1334#0: *10321 [proxy-wasm]["log_filter" #2] on_http_request_headers: 2 << , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "cd637736833a766d589b9f1affe3f12e"
2024-07-17T21:50:50.218339043Z 2024/07/17 21:50:50 [debug] 1334#0: *10321 [proxy-wasm] setting next action: pwctx->action = "CONTINUE" (pwctx: 000055E700FDFB90)
2024-07-17T21:50:50.218343606Z 2024/07/17 21:50:50 [debug] 1334#0: *10321 [proxy-wasm]["log_filter" #2] filter 1/1 resuming "on_request_body" step in "content" phase
2024-07-17T21:50:50.218347312Z 2024/07/17 21:50:50 [info] 1334#0: *10321 [proxy-wasm]["log_filter" #2] on_http_request_body: 2, end_of_stream: true <<< , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "cd637736833a766d589b9f1affe3f12e"
2024-07-17T21:50:50.218351241Z 2024/07/17 21:50:50 [info] 1334#0: *10321 [proxy-wasm]["log_filter" #2] on_http_request_body: 2, end_of_stream: true >>> , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "cd637736833a766d589b9f1affe3f12e"

After the last on_http_request_body call, the plugin 'hangs' and receives no more callbacks.

Kong config file:

format_version: "1.1"

services:
  - name: demo
    url: https://api.openai.com/v1
    routes:
      - name: openai-route
        paths:
          - /openai/v1/*
        methods:
          - GET
          - PUT
          - POST
          - PATCH
          - DELETE
          - OPTIONS
          - HEAD
          - CONNECT
          - TRACE
        strip_path: true
        filter_chains:
          - filters:
              - name: log_filter
                config:
#                    upstream: "http-echo:8080"
#                    base_uri: "<ignored>"
#                    debug: true
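
(For reference: this assumes Kong was started with Wasm support enabled, along these lines in kong.conf; the filters path here is an example, not a required value.)

# kong.conf (illustrative)
wasm = on
wasm_filters_path = /usr/local/kong/wasm   # directory containing log_filter.wasm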

thibaultcha commented 4 months ago

Kong 3.6.1 has an old version of the module in which I think such a bug can occur. There are issues with local Bazel builds of Kong 3.6.1, so I have not reproduced it there, but I have tried this filter and body payloads with Kong 3.7.1 and am not experiencing any issue so far; perhaps try 3.7.1, which has a recent version of the module.

sluramod commented 4 months ago

I've made some progress with 3.7.1. It now hangs in a more interesting way.

2024/07/18 04:11:40 [debug] 1341#0: *3217 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 000055C695420080)
2024-07-18T04:11:40.037260289Z 2024/07/18 04:11:40 [debug] 1341#0: *3217 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_response_headers" step in "header_filter" phase
2024-07-18T04:11:40.037282164Z 2024/07/18 04:11:40 [info] 1341#0: *3217 [proxy-wasm]["log_filter" #1] on_http_response_headers: 1 <<  while reading response header from upstream, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "9136aa3acf281a6be388b63df0588422"
2024-07-18T04:11:40.037288160Z 2024/07/18 04:11:40 [info] 1341#0: *3217 [proxy-wasm]["log_filter" #1] on_http_response_headers: 1 >>  while reading response header from upstream, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "9136aa3acf281a6be388b63df0588422"
2024-07-18T04:11:40.037292684Z 2024/07/18 04:11:40 [debug] 1341#0: *3217 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
2024-07-18T04:11:40.037295747Z 2024/07/18 04:11:40 [info] 1341#0: *3217 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: false <<  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "9136aa3acf281a6be388b63df0588422"
2024-07-18T04:11:40.037299438Z 2024/07/18 04:11:40 [info] 1341#0: *3217 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: false >>  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "9136aa3acf281a6be388b63df0588422"
2024-07-18T04:11:40.037429528Z 2024/07/18 04:11:40 [debug] 1341#0: *3217 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
2024-07-18T04:11:40.037449347Z 2024/07/18 04:11:40 [info] 1341#0: *3217 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: true <<  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "9136aa3acf281a6be388b63df0588422"
2024-07-18T04:11:40.037458824Z 2024/07/18 04:11:40 [info] 1341#0: *3217 [proxy-wasm]["log_filter" #1] on_http_response_body: 1, end_of_stream: true >>  while sending to client, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "9136aa3acf281a6be388b63df0588422"
2024-07-18T04:11:40.037863663Z 172.28.0.4 - - [18/Jul/2024:04:11:40 +0000] "POST /openai/v1/chat/completions HTTP/1.1" 200 643 "-" "OpenAI/v1 PythonBindings/0.28.1" kong_request_id: "9136aa3acf281a6be388b63df0588422"
2024-07-18T04:11:40.037904155Z 2024/07/18 04:11:40 [debug] 1341#0: *3217 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_log" step in "log" phase
...
2024/07/18 04:11:46 [debug] 1344#0: *10313 [proxy-wasm]["log_filter" #1] filter 1/1 finalizing context
2024-07-18T04:11:46.523332712Z 2024/07/18 04:11:46 [debug] 1344#0: *10313 [proxy-wasm] "log_filter" filter freeing context #1 (1/1)
2024-07-18T04:11:46.524585892Z 2024/07/18 04:11:46 [info] 1344#0: *10313 client 172.28.0.4 closed keepalive connection
2024-07-18T04:11:47.571339041Z 2024/07/18 04:11:47 [debug] 1333#0: *10317 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055C696DB3680, store: 000055C6928974B0)
2024-07-18T04:11:47.571400906Z 2024/07/18 04:11:47 [info] 1333#0: *10317 [proxy-wasm]["log_filter" #2] create_http_context_2: '2', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "aec4d97f1c2faf28d6011f5e0c7bb767"
2024-07-18T04:11:47.571417493Z 2024/07/18 04:11:47 [debug] 1333#0: *10317 [proxy-wasm]["log_filter" #2] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024-07-18T04:11:47.571420356Z 2024/07/18 04:11:47 [info] 1333#0: *10317 [proxy-wasm]["log_filter" #2] on_http_request_headers: 2 << , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "aec4d97f1c2faf28d6011f5e0c7bb767"
2024-07-18T04:11:47.571424191Z 2024/07/18 04:11:47 [debug] 1333#0: *10317 [proxy-wasm]["log_filter" #2] filter 1/1 resuming "on_request_body" step in "content" phase
2024-07-18T04:11:47.571427036Z 2024/07/18 04:11:47 [info] 1333#0: *10317 [proxy-wasm]["log_filter" #2] on_http_request_body: 2, end_of_stream: true <<< , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "aec4d97f1c2faf28d6011f5e0c7bb767"
2024-07-18T04:11:47.571430574Z 2024/07/18 04:11:47 [info] 1333#0: *10317 [proxy-wasm]["log_filter" #2] on_http_request_body: 2, end_of_stream: true >>> , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "aec4d97f1c2faf28d6011f5e0c7bb767"
2024-07-18T04:11:47.571432842Z 2024/07/18 04:11:47 [debug] 1333#0: *10317 [lua] init.lua:1394: balancer(): setting address (try 1): 104.18.7.192:443
2024-07-18T04:11:47.571434754Z 2024/07/18 04:11:47 [debug] 1333#0: *10317 [lua] init.lua:1427: balancer(): enabled connection keepalive (pool=104.18.7.192|443|api.openai.com, pool_size=512, idle_timeout=60, max_requests=10000)
2024-07-18T04:11:48.262311346Z 2024/07/18 04:11:48 [debug] 1333#0: *10317 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 000055C6975D4660)
...
2024/07/18 04:11:51 [info] 1333#0: *10317 [proxy-wasm]["log_filter" #2] on_log: 2 <<  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.7.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "aec4d97f1c2faf28d6011f5e0c7bb767"
2024-07-18T04:11:51.408863129Z 2024/07/18 04:11:51 [info] 1333#0: *10317 [proxy-wasm]["log_filter" #2] on_log: 2 >>  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.7.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "aec4d97f1c2faf28d6011f5e0c7bb767"
2024-07-18T04:11:51.408873778Z 2024/07/18 04:11:51 [debug] 1333#0: *10317 [proxy-wasm]["log_filter" #2] filter 1/1 resuming "on_done" step in "done" phase
2024-07-18T04:11:51.408877611Z 2024/07/18 04:11:51 [debug] 1333#0: *10317 [proxy-wasm]["log_filter" #2] filter 1/1 finalizing context
2024-07-18T04:11:51.408880992Z 2024/07/18 04:11:51 [debug] 1333#0: *10317 [proxy-wasm] "log_filter" filter freeing context #2 (1/1)
2024-07-18T04:11:51.445411155Z 2024/07/18 04:11:51 [info] 1333#0: *10317 client 172.28.0.4 closed keepalive connection
2024-07-18T04:11:59.484522173Z 2024/07/18 04:11:59 [debug] 1337#0: *10319 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055C696DB3680, store: 000055C6928974B0)
2024-07-18T04:11:59.484554146Z 2024/07/18 04:11:59 [info] 1337#0: *10319 [proxy-wasm]["log_filter" #1] create_http_context_2: '1', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "8b829c26dadbf872dd6b052d85b6df67"
2024-07-18T04:11:59.484558720Z 2024/07/18 04:11:59 [debug] 1337#0: *10319 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024-07-18T04:11:59.484561424Z 2024/07/18 04:11:59 [info] 1337#0: *10319 [proxy-wasm]["log_filter" #1] on_http_request_headers: 1 << , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "8b829c26dadbf872dd6b052d85b6df67"
2024-07-18T04:11:59.494048529Z 2024/07/18 04:11:59 [debug] 1337#0: *10319 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_request_body" step in "content" phase
2024-07-18T04:11:59.494082687Z 2024/07/18 04:11:59 [info] 1337#0: *10319 [proxy-wasm]["log_filter" #1] on_http_request_body: 1, end_of_stream: true <<< , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "8b829c26dadbf872dd6b052d85b6df67"
2024-07-18T04:11:59.494089117Z 2024/07/18 04:11:59 [info] 1337#0: *10319 [proxy-wasm]["log_filter" #1] on_http_request_body: 1, end_of_stream: true >>> , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "8b829c26dadbf872dd6b052d85b6df67"
2024-07-18T04:11:59.494106412Z 2024/07/18 04:11:59 [debug] 1337#0: *10319 [lua] init.lua:1394: balancer(): setting address (try 1): 104.18.6.192:443
2024-07-18T04:11:59.494109383Z 2024/07/18 04:11:59 [debug] 1337#0: *10319 [lua] init.lua:1427: balancer(): enabled connection keepalive (pool=104.18.6.192|443|api.openai.com, pool_size=512, idle_timeout=60, max_requests=10000)
2024-07-18T04:11:59.936487796Z 2024/07/18 04:11:59 [debug] 1337#0: *10319 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 000055C69619E920)
...
2024/07/18 04:12:06 [debug] 1337#0: *10319 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_log" step in "log" phase
2024-07-18T04:12:06.245789662Z 2024/07/18 04:12:06 [info] 1337#0: *10319 [proxy-wasm]["log_filter" #1] on_log: 1 <<  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "8b829c26dadbf872dd6b052d85b6df67"
2024-07-18T04:12:06.245811697Z 2024/07/18 04:12:06 [info] 1337#0: *10319 [proxy-wasm]["log_filter" #1] on_log: 1 >>  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "8b829c26dadbf872dd6b052d85b6df67"
2024-07-18T04:12:06.245816854Z 2024/07/18 04:12:06 [debug] 1337#0: *10319 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_done" step in "done" phase
2024-07-18T04:12:06.245820449Z 2024/07/18 04:12:06 [debug] 1337#0: *10319 [proxy-wasm]["log_filter" #1] filter 1/1 finalizing context
2024-07-18T04:12:06.245824024Z 2024/07/18 04:12:06 [debug] 1337#0: *10319 [proxy-wasm] "log_filter" filter freeing context #1 (1/1)
2024-07-18T04:12:06.257153615Z 2024/07/18 04:12:06 [info] 1337#0: *10319 client 172.28.0.4 closed keepalive connection
2024-07-18T04:13:04.405819958Z 2024/07/18 04:13:04 [debug] 1333#0: *10323 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055C696DB3680, store: 000055C6928974B0)
2024-07-18T04:13:04.405883482Z 2024/07/18 04:13:04 [info] 1333#0: *10323 [proxy-wasm]["log_filter" #3] create_http_context_2: '3', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "93b2942d6ed2ac3dc6392d1489dc5b0c"
2024-07-18T04:13:04.405893973Z 2024/07/18 04:13:04 [debug] 1333#0: *10323 [proxy-wasm]["log_filter" #3] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024-07-18T04:13:04.405897683Z 2024/07/18 04:13:04 [info] 1333#0: *10323 [proxy-wasm]["log_filter" #3] on_http_request_headers: 3 << , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "93b2942d6ed2ac3dc6392d1489dc5b0c"
2024-07-18T04:13:04.407065904Z 2024/07/18 04:13:04 [debug] 1333#0: *10323 [proxy-wasm]["log_filter" #3] filter 1/1 resuming "on_request_body" step in "content" phase
2024-07-18T04:13:04.407087118Z 2024/07/18 04:13:04 [info] 1333#0: *10323 [proxy-wasm]["log_filter" #3] on_http_request_body: 3, end_of_stream: true <<< , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "93b2942d6ed2ac3dc6392d1489dc5b0c"
2024-07-18T04:13:04.407091919Z 2024/07/18 04:13:04 [info] 1333#0: *10323 [proxy-wasm]["log_filter" #3] on_http_request_body: 3, end_of_stream: true >>> , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "93b2942d6ed2ac3dc6392d1489dc5b0c"

Then nothing related to Wasm appears beyond the last line for a while, until it resumes on context #1 (context #3 still hangs):

2024-07-18T04:21:38.109153473Z 2024/07/18 04:21:38 [debug] 1337#0: *647 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0069442387429013, runable_jobs_avg: 1.8559322033898, alive_threads_avg: 275.07796610169
2024-07-18T04:21:38.253142821Z 2024/07/18 04:21:38 [debug] 1336#0: *1291 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0065527453434291, runable_jobs_avg: 1.735593220339, alive_threads_avg: 275.07796610169
2024-07-18T04:21:38.472438985Z 2024/07/18 04:21:38 [debug] 1346#0: *3 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0064558592558772, runable_jobs_avg: 1.7152542372881, alive_threads_avg: 275.07796610169
2024-07-18T04:21:38.558231920Z 2024/07/18 04:21:38 [debug] 1344#0: *1933 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0078774787173895, runable_jobs_avg: 2.0757575757576, alive_threads_avg: 274.94949494949
2024-07-18T04:21:38.575622120Z 2024/07/18 04:21:38 [debug] 1341#0: *3218 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0068205853310109, runable_jobs_avg: 1.8237288135593, alive_threads_avg: 275.07796610169
2024-07-18T04:21:38.595310419Z 2024/07/18 04:21:38 [debug] 1335#0: *2575 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0068449955395552, runable_jobs_avg: 1.8274111675127, alive_threads_avg: 275.04568527919
2024-07-18T04:21:38.901230404Z 2024/07/18 04:21:38 [debug] 1333#0: *4509 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0082622850492273, runable_jobs_avg: 2.1925675675676, alive_threads_avg: 275.01351351351
2024-07-18T04:21:38.998211332Z 2024/07/18 04:21:38 [debug] 1342#0: *5152 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0073559626727898, runable_jobs_avg: 1.9610829103215, alive_threads_avg: 275.04568527919
2024-07-18T04:21:39.039230953Z 2024/07/18 04:21:39 [debug] 1332#0: *6446 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0069264156765128, runable_jobs_avg: 1.8508474576271, alive_threads_avg: 275.07796610169
2024-07-18T04:21:39.100532217Z 2024/07/18 04:21:39 [debug] 1339#0: *5794 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0068122863383369, runable_jobs_avg: 1.8220338983051, alive_threads_avg: 275.07796610169
2024-07-18T04:21:39.153580692Z 2024/07/18 04:21:39 [debug] 1345#0: *3865 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0067141950492858, runable_jobs_avg: 1.7932203389831, alive_threads_avg: 275.07796610169
2024-07-18T04:21:39.316983112Z 2024/07/18 04:21:39 [debug] 1343#0: *7097 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0069908744975868, runable_jobs_avg: 1.8641765704584, alive_threads_avg: 275.1103565365
2024-07-18T04:21:39.423484769Z 2024/07/18 04:21:39 [debug] 1338#0: *7092 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0069724047950485, runable_jobs_avg: 1.8593220338983, alive_threads_avg: 275.07796610169
2024-07-18T04:21:39.843343287Z 2024/07/18 04:21:39 [debug] 1334#0: *8378 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0068271042358349, runable_jobs_avg: 1.8254237288136, alive_threads_avg: 275.07796610169
2024-07-18T04:21:39.849188066Z 2024/07/18 04:21:39 [debug] 1347#0: *9022 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0069341689155306, runable_jobs_avg: 1.8494077834179, alive_threads_avg: 275.04568527919
2024-07-18T04:21:40.936324624Z 2024/07/18 04:21:40 [debug] 1340#0: *9666 [lua] super.lua:161: scaling_log(): [timer-ng] load_avg: 0.0077561348723302, runable_jobs_avg: 2.0420875420875, alive_threads_avg: 274.94949494949
2024-07-18T04:23:08.761358130Z 2024/07/18 04:23:08 [debug] 1343#0: *10341 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055C697715E50, store: 000055C6928974B0)
2024-07-18T04:23:08.761389964Z 2024/07/18 04:23:08 [info] 1343#0: *10341 [proxy-wasm]["log_filter" #1] create_http_context_2: '1', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "bb517f237a4cfd25b3649704c720653e"
2024-07-18T04:23:08.761395346Z 2024/07/18 04:23:08 [debug] 1343#0: *10341 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024-07-18T04:23:08.761398343Z 2024/07/18 04:23:08 [info] 1343#0: *10341 [proxy-wasm]["log_filter" #1] on_http_request_headers: 1 << , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "bb517f237a4cfd25b3649704c720653e"
2024-07-18T04:23:08.876612333Z 2024/07/18 04:23:08 [debug] 1343#0: *10341 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_request_body" step in "content" phase
...

Then it eventually hangs on context #2:

2024/07/18 04:23:31 [info] 1347#0: *10363 [proxy-wasm]["log_filter" #1] on_log: 1 <<  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "72ee7c2888c28b78b047f359377b5f79"
2024-07-18T04:23:31.467059789Z 2024/07/18 04:23:31 [info] 1347#0: *10363 [proxy-wasm]["log_filter" #1] on_log: 1 >>  while logging request, client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", upstream: "https://104.18.6.192:443/v1/chat/completions", host: "kong-wasm:8000", request_id: "72ee7c2888c28b78b047f359377b5f79"
2024-07-18T04:23:31.467063487Z 2024/07/18 04:23:31 [debug] 1347#0: *10363 [proxy-wasm]["log_filter" #1] filter 1/1 resuming "on_done" step in "done" phase
2024-07-18T04:23:31.467066370Z 2024/07/18 04:23:31 [debug] 1347#0: *10363 [proxy-wasm]["log_filter" #1] filter 1/1 finalizing context
2024-07-18T04:23:31.467069181Z 2024/07/18 04:23:31 [debug] 1347#0: *10363 [proxy-wasm] "log_filter" filter freeing context #1 (1/1)
2024-07-18T04:23:31.467081658Z 172.28.0.4 - - [18/Jul/2024:04:23:31 +0000] "POST /openai/v1/chat/completions HTTP/1.1" 200 118564 "-" "OpenAI/v1 PythonBindings/0.28.1" kong_request_id: "72ee7c2888c28b78b047f359377b5f79"
2024-07-18T04:23:31.477568781Z 2024/07/18 04:23:31 [info] 1347#0: *10363 client 172.28.0.4 closed keepalive connection
2024-07-18T04:23:31.619426844Z 2024/07/18 04:23:31 [debug] 1341#0: *10367 [proxy-wasm] "log_filter" filter reusing instance (ictx: 000055C697715E50, store: 000055C6928974B0)
2024-07-18T04:23:31.619455832Z 2024/07/18 04:23:31 [info] 1341#0: *10367 [proxy-wasm]["log_filter" #2] create_http_context_2: '2', client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "e12ef249f49fef61b924ac637d7a905a"
2024-07-18T04:23:31.619460842Z 2024/07/18 04:23:31 [debug] 1341#0: *10367 [proxy-wasm]["log_filter" #2] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024-07-18T04:23:31.619464145Z 2024/07/18 04:23:31 [info] 1341#0: *10367 [proxy-wasm]["log_filter" #2] on_http_request_headers: 2 << , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "e12ef249f49fef61b924ac637d7a905a"
2024-07-18T04:23:31.621797678Z 2024/07/18 04:23:31 [debug] 1341#0: *10367 [proxy-wasm]["log_filter" #2] filter 1/1 resuming "on_request_body" step in "content" phase
2024-07-18T04:23:31.621825808Z 2024/07/18 04:23:31 [info] 1341#0: *10367 [proxy-wasm]["log_filter" #2] on_http_request_body: 2, end_of_stream: true <<< , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "e12ef249f49fef61b924ac637d7a905a"
2024-07-18T04:23:31.621831175Z 2024/07/18 04:23:31 [info] 1341#0: *10367 [proxy-wasm]["log_filter" #2] on_http_request_body: 2, end_of_stream: true >>> , client: 172.28.0.4, server: kong, request: "POST /openai/v1/chat/completions HTTP/1.1", host: "kong-wasm:8000", request_id: "e12ef249f49fef61b924ac637d7a905a"

Btw, I'm using this project on GitHub for testing: reworkd/AgentGPT

thibaultcha commented 4 months ago

Could we see the full logs? It is worth noting that each worker has its own context_id counter, so context #1 in pid 1341 is not the same as context #1 in pid 1347. Mostly I wonder what happens on connection *10323 after the last callback invocation, so the full logs may be insightful.

sluramod commented 4 months ago

Unfortunately, today 3.7.1 is behaving exactly like 3.6.1 did before. I've attached a few sample runs: proxy_wasm_traces.txt

thibaultcha commented 4 months ago

Are there no other logs after the last line of this file? What if you send another request while it hangs? Are there more logs that appear?
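
For instance (adjust host/port to your setup), something like:

curl -i http://kong-wasm:8000/openai/v1/chat/completions -H 'Content-Type: application/json' -d '{}'

to see whether any filter callback fires at all while it hangs.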

sluramod commented 4 months ago

Nothing after that. FYI, in the attached file you can see several separate traces; they are separated by multiple blank lines. I restarted Kong to get each trace. Also FYI, Kong is running in Docker.

thibaultcha commented 4 months ago

So far I have not been able to reproduce this in either the Kong 3.6.1 or 3.7.1 images. I've been running the above filter in those Docker images and running wrk with request bodies, and have not encountered any issue. Since I don't have an OpenAI key (no phone number to give them), I tested with another upstream (https://httpbin.org). Could you try with another upstream as well and see if the problem persists?
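
Swapping the upstream only requires changing the service URL in the config above; for example (the /anything endpoint simply echoes the request back):

services:
  - name: demo
    url: https://httpbin.org/anything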

sluramod commented 4 months ago

My goal was to test with a real application connected to the OpenAI API; I don't have a test suite ready to verify with httpbin.

However, I did deploy exactly the same plugin to an Envoy Docker container, configured the app to proxy requests through the Envoy container instead of the Kong container, and made no other changes. The exact same plugin works in Envoy without hanging. Then I stopped the Envoy container, started the Kong container, and Kong with the Wasm plugin stopped responding after a few request/responses.

I will try with another upstream later.

Perhaps the OpenAI service definition in Kong requires some special configuration? I shared the config I'm using above.

thibaultcha commented 4 months ago

Nothing seems wrong with the configuration. Could we also see the request the client is making? Headers and body, so we can try to spot anything that could trigger this.
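
If capturing them on the client side is inconvenient, one option is to dump them from the filter itself; a sketch of the two callbacks using the proxy-wasm Rust SDK:

fn on_http_request_headers(&mut self, _num_headers: usize, _end_of_stream: bool) -> Action {
    // Log every client request header as seen by the filter.
    for (name, value) in self.get_http_request_headers() {
        log::info!("request header: {}: {}", name, value);
    }
    Action::Continue
}

fn on_http_request_body(&mut self, body_size: usize, end_of_stream: bool) -> Action {
    // Read the buffered request body (may be partial until end_of_stream is true).
    if let Some(body) = self.get_http_request_body(0, body_size) {
        log::info!("request body ({} bytes, eos: {}): {}",
                   body_size, end_of_stream, String::from_utf8_lossy(&body));
    }
    Action::Continue
}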

thibaultcha commented 1 month ago

Closing: unable to reproduce, and no further activity.