Closed — youzeliang closed this issue 2 days ago
@youzeliang Thanks for reaching out. I'm currently on vacation; I will be back in January.
I see this code in resty/v2:

func closeq(v interface{}) {
	if c, ok := v.(io.Closer); ok {
		silently(c.Close())
	}
}

func silently(_ ...interface{}) {}
See client.go#12000
@youzeliang Thanks for reaching out. Based on the above details: currently, the method R().SetFormData reads all the form-data values into memory to build the request body, which is why memory usage increases. In v3, there are plans to optimize this flow.
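To make the memory behavior concrete, here is a minimal sketch of roughly what SetFormData amounts to (this is an illustration, not the actual resty source: buildFormBody is a hypothetical helper):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// buildFormBody sketches the read-all approach: every key/value pair is
// URL-encoded into a single in-memory string, which then backs the
// request body. Peak memory is proportional to the total form size.
func buildFormBody(data map[string]string) *strings.Reader {
	form := url.Values{}
	for k, v := range data {
		form.Set(k, v)
	}
	return strings.NewReader(form.Encode())
}

func main() {
	r := buildFormBody(map[string]string{"name": "resty"})
	fmt.Println(r.Len()) // size of the fully buffered body → 10
}
```

For large forms, an io.Reader-based body lets the HTTP transport copy the data in chunks instead of holding one large allocation.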
Based on the issue details, I'm still determining whether a 4 KB response body can cause a memory leak.
However, the issue raises a good point about utilizing streaming vs read-all.
In Resty v3 development, the request and response flow has been redesigned and optimized to add additional features.
See the Response.Body field. Resty v3 brings many new features and improvements that enhance its functionality and performance.
In most cases, I use the resty library's Post method to send POST requests, and everything works fine with no issues. However, in one particular scenario, when I receive a large response structure, I encounter a memory leak. First, I want to clarify that my code doesn't have any goroutine leaks or lingering global variables (i.e., no delayed releases).

Upon inspecting with pprof, it seems that the underlying io.ReadAll operation is consuming a significant amount of memory. Are there any plans to optimize this part? I will attach my pprof results and some key information, along with the crucial parts of the code.
resty.New().
	OnBeforeRequest(func(client *resty.Client, r *resty.Request) error { return nil }).
	SetRetryCount(2).
	SetTimeout(time.Second * 8).
	R().
	SetFormData(apiBody).
	SetResult(&apiRes).
	Post(url)
The response apiRes is about 4 KB.