cngdkxw closed this issue 2 years ago
Hi,
I found that the memory usage was at 500M from the very beginning, while the other indicators stayed at a low level. Please try the following steps:

1. Check the memory limit in your `yaml` file or in the namespace default settings, and even any system-wide limits of your Kubernetes cluster.
2. Send `SIGTSTP` to the pipy container and paste the log here. This will help you find what takes so much memory. If you can enter the shell environment of the pipy container, just run `kill -SIGTSTP <PID of Pipy>`.
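For step 1, the per-container limit usually lives in the sidecar's `resources` block of the pod spec. A minimal sketch, with a hypothetical pod name (the image tag is one mentioned in this thread) and illustrative request/limit values:

```yaml
# Illustrative only -- pod name and values are examples, not from the demo repo.
apiVersion: v1
kind: Pod
metadata:
  name: demo-pod
spec:
  containers:
  - name: pipy
    image: flomesh/pipy-pjs:0.4.0-312
    resources:
      requests:
        memory: "4Mi"
      limits:
        memory: "2Gi"
```

Note that when the pod spec sets no limit, a namespace-level `LimitRange` object or default can still impose one, which is the "namespace default settings" case above.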
1. The container's memory request is 4M and its limit is 2G.
2. Ran `kill -SIGTSTP` inside the container; it reported: `total 541356 528091 526720 0`
3. After upgrading the pipy image to flomesh/pipy-pjs:0.4.0-312 and restarting several times, OOMKilled has not appeared again, and memory usage has stabilized at around 10M.

The full output of `kill -SIGTSTP` is as follows:
CLASS #INSTANCES
ContextData 3
Object 61
pipy::Configuration 11
pipy::Console 1
pipy::Data 27
pipy::Global 1
pipy::Hessian 1
pipy::Inbound 5
pipy::JSON 1
pipy::Message 4
pipy::Netmask 1
pipy::OS 1
pipy::Pipy 1
pipy::StreamEnd 1
pipy::URL 2
pipy::URLSearchParams 2
pipy::XML 1
pipy::algo::Algo 1
pipy::algo::ResourcePool 1
pipy::algo::RoundRobinLoadBalancer 9
pipy::crypto::Crypto 1
pipy::http::Http 1
pipy::http::RequestHead 3
pipy::http::ResponseHead 2
pjs::Array 7
pjs::Constructor<pipy::Data> 1
pjs::Constructor<pipy::Message> 1
pjs::Constructor<pipy::MessageEnd> 1
pjs::Constructor<pipy::MessageStart> 1
pjs::Constructor<pipy::Netmask> 1
pjs::Constructor<pipy::StreamEnd> 1
pjs::Constructor<pipy::URL> 1
pjs::Constructor<pipy::URLSearchParams> 1
pjs::Constructor<pipy::XML::Node> 1
pjs::Constructor<pipy::algo::Cache> 1
pjs::Constructor<pipy::algo::HashingLoadBalancer> 1
pjs::Constructor<pipy::algo::LeastWorkLoadBalancer> 1
pjs::Constructor<pipy::algo::Percentile> 1
pjs::Constructor<pipy::algo::ResourcePool> 1
pjs::Constructor<pipy::algo::RoundRobinLoadBalancer> 1
pjs::Constructor<pipy::algo::URLRouter> 1
pjs::Constructor<pipy::crypto::Certificate> 1
pjs::Constructor<pipy::crypto::CertificateChain> 1
pjs::Constructor<pipy::crypto::Cipher> 1
pjs::Constructor<pipy::crypto::Decipher> 1
pjs::Constructor<pipy::crypto::Hash> 1
pjs::Constructor<pipy::crypto::Hmac> 1
pjs::Constructor<pipy::crypto::JWK> 1
pjs::Constructor<pipy::crypto::JWT> 1
pjs::Constructor<pipy::crypto::PrivateKey> 1
pjs::Constructor<pipy::crypto::PublicKey> 1
pjs::Constructor<pipy::crypto::Sign> 1
pjs::Constructor<pipy::crypto::Verify> 1
pjs::Constructor<pipy::http::File> 1
pjs::Constructor<pjs::Array> 1
pjs::Constructor<pjs::Boolean> 1
pjs::Constructor<pjs::Date> 1
pjs::Constructor<pjs::Number> 1
pjs::Constructor<pjs::Object> 1
pjs::Constructor<pjs::RegExp> 1
pjs::Constructor<pjs::String> 1
pjs::Function 38
pjs::RegExp 1
TOTAL 224

DATA CURRENT(KB) PEAK(KB)
Unknown 0 0
Script 0 0
HTTP Encoder 0 20
HTTP2 Codec 0 0
connectSOCKS 0 0
TLS 0 0
Message 8 8
Command Line Options 0 0
Outbound 116 136
os.readFile 0 0
inflate 0 8
JSON 0 8
Inbound 32 32
pack 0 0
TOTAL 156 n/a

PIPELINE #ALLOCATED #ACTIVE
/main.js [:::8081] 1 0
/main.js [:::8113] 0 0
/main.js [:::8771] 2 2
/main.js [eureka] 1 0
/main.js [inbound] 0 0
/main.js [outbound] 1 0
/plugins/balancer.js [connection] 1 0
/plugins/balancer.js [forward] 1 0
/plugins/balancer.js [load-balance] 1 0
/plugins/balancer.js [request] 1 0
/plugins/balancer.js [session] 1 0
/plugins/default.js [request] 0 0
/plugins/eureka.js [Task #1] 1 0
/plugins/eureka.js [connection] 1 1
/plugins/eureka.js [forward] 1 0
/plugins/header-injection.js [request] 1 0
/plugins/inbound/ban.js [bypass] 0 0
/plugins/inbound/ban.js [deny] 0 0
/plugins/inbound/ban.js [request] 0 0
/plugins/inbound/circuit-breaker.js [circuit-break] 0 0
/plugins/inbound/circuit-breaker.js [request] 0 0
/plugins/inbound/circuit-breaker.js [session] 0 0
/plugins/inbound/inbound.js [connection] 0 0
/plugins/inbound/inbound.js [request] 0 0
/plugins/inbound/throttle.js [bypass] 0 0
/plugins/inbound/throttle.js [request] 0 0
/plugins/inbound/throttle.js [throttle] 0 0
/plugins/logger.js [log-request] 1 0
/plugins/logger.js [log-response] 1 0
/plugins/logger.js [log-send] 1 0
/plugins/logger.js [request] 1 0
/plugins/logger.js [response] 1 0
/plugins/router.js [request] 1 0
/plugins/router.js [session] 1 0
[Fetch Connection] 1 1
[Fetch] 2 0
TOTAL 24 4

INBOUND #CONNECTIONS BUFFERED(KB)
8081 0/1 0
8113 0/0 0
8771 2/2 0
TOTAL 2 0

OUTBOUND #CONNECTIONS BUFFERED(KB) #OVERFLOWED MAX_CONN_TIME AVG_CONN_TIME
[samples-discovery-server]:8761 1 0 0 1 1
[10.22.4.192]:6060 1 0 0 1 1
TOTAL 2 0 0 1 1
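The dump above was produced by running `kill -SIGTSTP` inside the container; the same dump can be triggered from outside the pod with `kubectl exec`. A sketch, with placeholder pod/container names, assuming `pgrep` exists in the image (minimal images may need `ps` or `/proc` instead):

```shell
# Sketch: trigger Pipy's SIGTSTP stats dump from outside the pod.
# POD and CONTAINER are placeholder names; substitute your own.
POD=my-pod
CONTAINER=pipy
# Pipy writes the dump to its standard output, so it lands in the container log.
DUMP_CMD="kubectl exec $POD -c $CONTAINER -- sh -c 'kill -TSTP \$(pgrep pipy)'"
LOGS_CMD="kubectl logs $POD -c $CONTAINER"
# Printed rather than executed, so the sketch runs anywhere:
echo "$DUMP_CMD"
echo "$LOGS_CMD"
```

Running the two printed commands in order sends the signal and then reads back the class/data/pipeline/inbound/outbound tables from the container log.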
From this memory usage dump, everything looks good. If version flomesh/pipy-pjs:0.4.0-312 runs well for several days, let's close this issue.
This occurred while setting up the demo following https://github.com/flomesh-io/service-mesh-demo.

pipy image version: flomesh/pipy-pjs:0.4.0-263

When the pipy container is started as a pod sidecar, it can be OOMKilled very quickly; after a few restarts, the error no longer appears.

When checking the pipy container's logs, the log stops here; a normally started pipy would print more log lines after this point.

Checking the pipy container's memory metrics shows it using more than 500M. The chart shows memory usage steady at 500M+, but sometimes it drops down to only a few MB.
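The fluctuating usage described above can be cross-checked from inside the container by reading the kernel's cgroup memory accounting, which is roughly the counter the metrics dashboard is derived from. This is a generic sketch, not a Pipy-specific tool; run it inside the container (e.g. via `kubectl exec`):

```shell
# Sketch: read the container's current memory usage from the cgroup files.
# Covers cgroup v2 (memory.current) and cgroup v1 (memory.usage_in_bytes);
# falls back to "unknown" when neither file is readable.
usage_mib=unknown
for f in /sys/fs/cgroup/memory.current \
         /sys/fs/cgroup/memory/memory.usage_in_bytes; do
  if [ -r "$f" ]; then
    usage_mib=$(( $(cat "$f") / 1024 / 1024 ))
    break
  fi
done
echo "current memory usage: ${usage_mib} MiB"
```

If this number stays in the hundreds of MB while the SIGTSTP dump reports only tens of KB of tracked data, the difference is memory held outside Pipy's own accounting (allocator caches, mapped files, etc.), which is worth comparing before and after a restart.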