apache / apisix-go-plugin-runner

Go Plugin Runner for APISIX
https://apisix.apache.org/
Apache License 2.0

request help: is the plugin performant? #73

Closed Horus-K closed 2 years ago

Horus-K commented 2 years ago

Issue description

I used the official demo to return an HTML file directly, but its performance is poor.

Environment

package plugins

import (
    "encoding/base64"
    "encoding/json"
    "net/http"

    pkgHTTP "github.com/apache/apisix-go-plugin-runner/pkg/http"
    "github.com/apache/apisix-go-plugin-runner/pkg/log"
    "github.com/apache/apisix-go-plugin-runner/pkg/plugin"
)

func init() {
    log.Infof("init start...")
    err := plugin.RegisterPlugin(&Say{})
    if err != nil {
        log.Fatalf("failed to register plugin say: %s", err)
    }
}

// Say is a demo to show how to return data directly instead of proxying
// it to the upstream.
type Say struct {
}

type SayConf struct {
    Body string `json:"body"`
}

func (p *Say) Name() string {
    return "say"
}

func (p *Say) ParseConf(in []byte) (interface{}, error) {
    log.Infof("parsing conf...")
    conf := SayConf{}
    err := json.Unmarshal(in, &conf)
    return conf, err
}

func (p *Say) Filter(conf interface{}, w http.ResponseWriter, r pkgHTTP.Request) {
    // read the configured value
    body := conf.(SayConf).Body

    if len(body) == 0 {
        log.Errorf("body is empty")
        return
    }
    resBody, err := base64.StdEncoding.DecodeString(body)
    if err != nil {
        log.Errorf("invalid base64 in body")
        return
    }
    w.Header().Set("Content-Type", "text/html")
    _, err = w.Write(resBody)
    if err != nil {
        log.Errorf("failed to write: %s", err)
    }
}
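One detail worth noting in this demo (a sketch of a possible tweak, not part of the original report): the body is base64-decoded inside Filter, so the fairly large HTML payload is decoded again on every request. Decoding once in ParseConf and caching the decoded bytes in the parsed conf would remove that per-request work. The DecodedSayConf type below is an illustrative assumption, not part of the runner's API:

// Sketch: decode the base64 body once, at configuration-parse time.
type DecodedSayConf struct {
    Body []byte
}

func (p *Say) ParseConf(in []byte) (interface{}, error) {
    conf := SayConf{}
    if err := json.Unmarshal(in, &conf); err != nil {
        return nil, err
    }
    decoded, err := base64.StdEncoding.DecodeString(conf.Body)
    if err != nil {
        return nil, err
    }
    return DecodedSayConf{Body: decoded}, nil
}

func (p *Say) Filter(conf interface{}, w http.ResponseWriter, r pkgHTTP.Request) {
    body := conf.(DecodedSayConf).Body
    if len(body) == 0 {
        return
    }
    w.Header().Set("Content-Type", "text/html")
    if _, err := w.Write(body); err != nil {
        log.Errorf("failed to write: %s", err)
    }
}

With this change the base64 work moves out of the request hot path; the rest of the plugin stays the same.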

Horus-K commented 2 years ago

plugins conf


"plugins": {
    "ext-plugin-post-req": {
      "conf": [
        {
          "name": "say",
          "value": "{\"body\":\"PCFET0NUWVBFIGh0bWw+PGh0bWw+CiAgPGhlYWQ+CiAgICA8bGluayByZWw9Imljb24iIGhyZWY9Ii8vY2RuLnhpYW95dWFuaGFvLmNvbS90ZXN0L21vYmlsZS1uYnVncy1mcm9udGVuZC8xLjAuMTkvZmF2aWNvbi5wbmciIHR5cGU9ImltYWdlL3gtaWNvbiI+CiAgICA8bGluawogICAgICByZWw9InN0eWxlc2hlZXQiCiAgICAgIGhyZWY9Ii8vY2RuLnhpYW95dWFuaGFvLmNvbS90ZXN0L21vYmlsZS1uYnVncy1mcm9udGVuZC8xLjAuMTkvdW1pLmNzcyIKICAgIC8+CiAgICA8bWV0YSBuYW1lPSJ4LXNlcnZlci1lbnYiIGNvbnRlbnQ9InRlc3QiIC8+CiAgICA8bWV0YSBjaGFyc2V0PSJ1dGYtOCIgLz4KICAgIDxtZXRhCiAgICAgIG5hbWU9InZpZXdwb3J0IgogICAgICBjb250ZW50PSJ3aWR0aD1kZXZpY2Utd2lkdGgsaW5pdGlhbC1zY2FsZT0xLG1heGltdW0tc2NhbGU9MSxtaW5pbXVtLXNjYWxlPTEsdXNlci1zY2FsYWJsZT1ubyIKICAgIC8+CiAgICA8c2NyaXB0IHNyYz0iLy9zLnhpYW95dWFuaGFvLmNvbS9uYnVncy1jZG4vanMvMC4wLjEvbW9iaWxlLWdsb2JhbC11dGlscy5qcyI+PC9zY3JpcHQ+CiAgICA8dGl0bGU+PC90aXRsZT4KICAgIDxsaW5rIHJlbD0ic3R5bGVzaGVldCIgdHlwZT0idGV4dC9jc3MiIGhyZWY9Ii8vcy54aWFveXVhbmhhby5jb20vY29tbW9uL2Nzcy9hbnRkLW1vYmlsZS1tb2RhbC5jc3MiLz4KICAgIDxsaW5rIHJlbD0ic3R5bGVzaGVldCIgaHJlZj0iLy9zLnhpYW95dWFuaGFvLmNvbS9jc3Mvd2V1aS8xLjAuMS9zdHlsZS93ZXVpLm1pbi5jc3MiPgogICAgPGxpbmsgcmVsPSJzdHlsZXNoZWV0IiBocmVmPSIvL3MueGlhb3l1YW5oYW8uY29tL3dldWkteC8xLjEuNC9pbmRleC5taW4uY3NzIj4KICAgIDxzY3JpcHQgdHlwZT0idGV4dC9qYXZhc2NyaXB0IiBzcmM9Ii8vc3RhdGljLnJ1bm9vYi5jb20vYXNzZXRzL3FyY29kZS9xcmNvZGUubWluLmpzIj48L3NjcmlwdD4KICAgIDxzY3JpcHQgdHlwZT0idGV4dC9qYXZhc2NyaXB0IiBzcmM9Ii8vcy54aWFveXVhbmhhby5jb20vZnJvbnQtZW5kLW1vbml0b3IvMS42LjAvdHJhY2tlci5qcyI+PC9zY3JpcHQ+PHNjcmlwdCBzcmM9Ii8vcy54aWFveXVhbmhhby5jb20vZXJ1ZGEvMS41LjIvZXJ1ZGEubWluLmpzIj48L3NjcmlwdD4KICAgICAgPHNjcmlwdD4KICAgICAgICBlcnVkYS5pbml0KCk7CiAgICAgIDwvc2NyaXB0PjxzdHlsZT4KICAgICAgI3Jvb3R7CiAgICAgICAgYmFja2dyb3VuZC1jb2xvcjogI2Y3ZjdmNzsKICAgICAgfQogICAgICA8L3N0eWxlPgogIDwvaGVhZD4KICA8Ym9keT4KICAgIDxkaXYgaWQ9InJvb3QiPjwvZGl2PgogICAgPHNjcmlwdCBzcmM9Ii8vY2RuLnhpYW95dWFuaGFvLmNvbS90ZXN0L21vYmlsZS1uYnVncy1mcm9udGVuZC8xLjAuMTkvdW1pLmpzIj48L3NjcmlwdD4KICA8L2JvZHk+CjwvaHRtbD4=\"}"
        }
      ],
      "disable": false
    }
  }
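In case it helps reproduce the setup: the long value above is just the HTML page base64-encoded and wrapped in the plugin's JSON conf. A small stand-alone helper like the sketch below can generate that string (the index.html file name is only an example):

// Hypothetical helper: prints the JSON string to paste into the plugin's
// "value" field, given an HTML file on disk.
package main

import (
    "encoding/base64"
    "encoding/json"
    "fmt"
    "os"
)

func main() {
    html, err := os.ReadFile("index.html") // example input file
    if err != nil {
        panic(err)
    }
    value, err := json.Marshal(map[string]string{
        "body": base64.StdEncoding.EncodeToString(html),
    })
    if err != nil {
        panic(err)
    }
    fmt.Println(string(value))
}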
shuaijinchao commented 2 years ago

hi, @Horus-K I ran a comparison test with your configuration. In the same scenario, the average processing time with the Go Runner is nearly 3 times that of a native APISIX plugin, and its QPS is about 30% lower. The overhead most likely comes from copying data between processes and parsing the protocol data in the Runner.

html response

# GoRunner
$ wrk -t 8 -c 200 -d 10s http://192.168.56.199:9080/hello
Running 10s test @ http://192.168.56.199:9080/hello
  8 threads and 200 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    96.26ms  215.11ms   1.94s    93.97%
    Req/Sec   533.43    230.84     2.01k    73.43%
  42405 requests in 10.03s, 64.99MB read
Requests/sec:   4228.93
Transfer/sec:      6.48MB

# APISIX
$ wrk -t 8 -c 200 -d 10s http://192.168.56.199:9080/world
Running 10s test @ http://192.168.56.199:9080/world
  8 threads and 200 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    39.52ms   50.57ms 830.05ms   96.40%
    Req/Sec   767.96    104.59     1.42k    88.25%
  61186 requests in 10.01s, 93.77MB read
Requests/sec:   6112.90
Transfer/sec:      9.37MB

say hello response

$ wrk -t 8 -c 200 -d 10s http://192.168.56.199:9080/hello
Running 10s test @ http://192.168.56.199:9080/hello
  8 threads and 200 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.71s     1.61s    6.91s    48.98%
    Req/Sec        nan       nan   0.00      0.00%
  52485 requests in 10.01s, 10.46MB read
Requests/sec:   5244.93
Transfer/sec:      1.05MB

$ wrk -t 8 -c 200 -d 10s http://192.168.56.199:9080/world
Running 10s test @ http://192.168.56.199:9080/world
  8 threads and 200 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   524.45ms  563.29ms   2.39s    81.96%
    Req/Sec        nan       nan   0.00      0.00%
  75067 requests in 10.00s, 13.31MB read
Requests/sec:   7505.93
Transfer/sec:      1.33MB
Horus-K commented 2 years ago

During the stress test I found that the runner always runs as a single process. Can the runner run multiple processes?

shuaijinchao commented 2 years ago

Although I haven't experimented with it yet, this approach may not be the optimal solution. At present, the Runner runs as a sub-process of Nginx; making it multi-process would require a lot of extra work, including sharing data between processes and managing multiple processes.
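For context, a conceptual sketch only (not the runner's actual code): a single Go process can already serve many concurrent connections by handling each one in its own goroutine, so the Go scheduler spreads the work across CPU cores without a multi-process model. The socket path and handleConn below are purely illustrative:

// Conceptual sketch: one process, one goroutine per connection.
package main

import (
    "log"
    "net"
)

func handleConn(c net.Conn) {
    defer c.Close()
    // read the request, run the plugin logic, write the response back
}

func main() {
    ln, err := net.Listen("unix", "/tmp/runner.sock") // illustrative path
    if err != nil {
        log.Fatal(err)
    }
    for {
        conn, err := ln.Accept()
        if err != nil {
            log.Fatal(err)
        }
        go handleConn(conn) // concurrency within a single process
    }
}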

Horus-K commented 2 years ago

We have deployed the runner to our development environment and are looking forward to subsequent updates.