Kong / kong

🦍 The Cloud-Native API Gateway and AI Gateway.
https://konghq.com/install/#kong-community
Apache License 2.0

Is there any configuration or solution that could fetch and log the request body in the log file? #6877

Closed quzhixue-Kimi closed 3 years ago

quzhixue-Kimi commented 3 years ago

Hi there,

As the first step, the kong-ingress-controller and Kong were deployed to my K8S cluster via kubectl apply -f https://raw.githubusercontent.com/Kong/kubernetes-ingress-controller/master/deploy/single/all-in-one-dbless.yaml.

The file-log plugin (https://docs.konghq.com/hub/kong-inc/file-log/) was also installed as a global plugin.
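
Roughly, the global plugin was declared like the sketch below (the resource name, annotation, and log path here are only an illustration of such a KongClusterPlugin, not my exact manifest):

apiVersion: configuration.konghq.com/v1
kind: KongClusterPlugin
metadata:
  name: global-file-log            # illustrative name only
  annotations:
    kubernetes.io/ingress.class: kong
  labels:
    global: "true"                 # makes the plugin apply to all services/routes
plugin: file-log
config:
  path: /tmp/file.log              # illustrative path where file-log writes its JSON lines

It was then applied with kubectl apply -f, like the other resources.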

Finally, my sample application was deployed to the K8S cluster.

When I test POSTing to an endpoint of my sample application via:

curl -X POST -H "Content-Type: application/json" -d '{"user_id": "123", "coin":100, "success":1, "msg":"OK!" }' localhost/sayHello1 | jq .

The result from the file-log plugin is:

{ "latencies": { "request": 27, "kong": 1, "proxy": 26 }, "service": { "host": "httpbin.default.80.svc", "created_at": 1614584110, "connect_timeout": 60000, "id": "764d9f4d-c24c-5709-991c-19a96a581f6e", "protocol": "http", "name": "default.httpbin.80", "read_timeout": 60000, "port": 80, "path": "/", "updated_at": 1614584110, "write_timeout": 60000, "retries": 5, "ws_id": "0dc6f45b-8f8d-40d2-a504-473544ee190b" }, "request": { "querystring": {}, "size": 192, "uri": "/sayHello1", "url": "http://localhost:80/sayHello1", "headers": { "host": "localhost", "content-type": "application/json", "user-agent": "curl/7.64.1", "accept": "/", "content-length": "57" }, "method": "POST" }, "client_ip": "192.168.65.3", "tries": [ { "balancer_latency": 0, "port": 3000, "balancer_start": 1614584261592, "ip": "10.1.0.78" } ], "upstream_uri": "/sayHello1", "response": { "headers": { "content-type": "application/json", "date": "Mon, 01 Mar 2021 07:37:41 GMT", "via": "kong/2.2.1", "connection": "close", "x-kong-proxy-latency": "0", "x-kong-upstream-latency": "26", "transfer-encoding": "chunked" }, "status": 200, "size": 272 }, "route": { "id": "5d1c5b31-f084-57b3-9eff-36a5a8b1c09e", "paths": [ "/" ], "protocols": [ "http", "https" ], "created_at": 1614584110, "ws_id": "0dc6f45b-8f8d-40d2-a504-473544ee190b", "service": { "id": "764d9f4d-c24c-5709-991c-19a96a581f6e" }, "name": "default.httpbin-svc-ingress.00", "updated_at": 1614584110, "preserve_host": true, "regex_priority": 200, "strip_path": false, "response_buffering": true, "https_redirect_status_code": 426, "path_handling": "v0", "request_buffering": true }, "started_at": 1614584261592 }

My concern is whether I made any incorrect steps, since there is no request body ({"user_id": "123", "coin":100, "success":1, "msg":"OK!"}) logged in the "request" section of the result.

My sample application is a simple Spring Boot REST controller, shown below; I can see the output in the application console.

@PostMapping(
        value = "/sayHello1",
        consumes = MediaType.APPLICATION_JSON_VALUE,
        produces = MediaType.APPLICATION_JSON_VALUE)
public void sayHello1(@RequestBody Map<String, Object> params) {
    System.out.println(params);
}

BR Kimi

quzhixue-Kimi commented 3 years ago

Hi there,

Sorry for interrupting you again.

Is it possible to also include the response body within the "response" section? For example, one endpoint of my sample application is /hello, which returns the string 'say hello2.0-gitlab'. Would that string value be added to the "response" section as well?

Sorry for the inconvenience. BR Kimi