Closed by alenavorozhbieva 4 years ago
Thanks for the report and for providing all that information. You may only have one volume, but as the heap info above shows, most of the memory used is in the HTTP request handling support, so I bet the growth is actually proportional to the number of HTTP requests handled. I bet you'd see a lot less growth if the server was not handling as many HTTP requests.
That said, the handling of requests, especially in server/rest/asynchttp.go, is probably sub-optimal; at least that is what I've believed for a while. We'd greatly appreciate it if you could look into whether this is the source of the growth!
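For reference, heap data like the profile above can be pulled from the running process once Go's standard net/http/pprof handlers are exposed; a minimal sketch of that setup (how Heketi actually wires its listener may differ):

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on http.DefaultServeMux
)

func main() {
	// Serve the profiling endpoints on a side port so heap snapshots can be
	// taken while the main server keeps handling requests.
	log.Fatal(http.ListenAndServe("localhost:6060", nil))
}
```

A snapshot can then be inspected with go tool pprof, e.g. `go tool pprof -top http://localhost:6060/debug/pprof/heap`, or rendered with `-pdf` (graphviz required) into a graph like the attached PDFs.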
@phlogistonjohn thank you for your reply. server/rest/asynchttp.go is not in the release/8 version yet. What should I look into instead?
I didn't realize you were using a version that old. I'd suggest using the more recent v10.0.0 release and seeing if the problem is still exhibited by that version.
I deployed the v10 release and am still seeing the memory leak. New pprof: profile001.pdf
Largest traces:
7.50MB net/textproto.(*Reader).ReadMIMEHeader
net/http.readRequest
net/http.(*conn).readRequest
net/http.(*conn).serve
4MB reflect.mapassign
reflect.Value.SetMapIndex
encoding/json.(*decodeState).object
encoding/json.(*decodeState).value
encoding/json.(*decodeState).unmarshal
encoding/json.Unmarshal
github.com/heketi/heketi/vendor/github.com/dgrijalva/jwt-go.(*Parser).ParseUnverified
github.com/heketi/heketi/vendor/github.com/dgrijalva/jwt-go.(*Parser).ParseWithClaims
github.com/heketi/heketi/vendor/github.com/dgrijalva/jwt-go.ParseWithClaims (inline)
github.com/heketi/heketi/middleware.(*JwtAuth).ServeHTTP
github.com/heketi/heketi/vendor/github.com/urfave/negroni.middleware.ServeHTTP
github.com/heketi/heketi/vendor/github.com/urfave/negroni.(*Logger).ServeHTTP
github.com/heketi/heketi/vendor/github.com/urfave/negroni.middleware.ServeHTTP
github.com/heketi/heketi/vendor/github.com/urfave/negroni.(*Recovery).ServeHTTP
github.com/heketi/heketi/vendor/github.com/urfave/negroni.middleware.ServeHTTP
github.com/heketi/heketi/vendor/github.com/urfave/negroni.(*Negroni).ServeHTTP
github.com/heketi/heketi/vendor/github.com/gorilla/mux.(*Router).ServeHTTP
net/http.serverHandler.ServeHTTP
net/http.(*conn).serve
The root cause of the memory leak was found. It comes from gorilla/context usage with a Go version higher than 1.7; Heketi uses Go 1.8. See https://github.com/gorilla/context. This is the place in the Heketi code that produces the memory leak: https://github.com/heketi/heketi/blob/master/middleware/jwt.go#L190
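For anyone hitting the same thing: gorilla/context keeps per-request values in a package-level map keyed by the *http.Request pointer, so an entry that is set and never cleared for that exact pointer stays reachable forever; with Go 1.7+ the request is shallow-copied whenever its context is replaced, which makes it easy for context.ClearHandler to miss the pointer the values were set on. A minimal illustration of the leaking pattern (the actual Heketi middleware may differ in detail):

```go
package middleware

import (
	"net/http"

	"github.com/gorilla/context"
)

// JwtAuth stands in for Heketi's JWT middleware type.
type JwtAuth struct{}

// parseClaims is a hypothetical helper standing in for the jwt-go parsing
// visible in the traces above.
func parseClaims(r *http.Request) map[string]interface{} {
	return map[string]interface{}{"sub": r.Header.Get("Authorization")}
}

// ServeHTTP stores the parsed claims in gorilla/context's package-level map,
// keyed by the *http.Request pointer. If nothing removes the entry for that
// same pointer, every request handled leaves an entry behind, so heap usage
// grows in proportion to the number of requests.
func (j *JwtAuth) ServeHTTP(w http.ResponseWriter, r *http.Request, next http.HandlerFunc) {
	claims := parseClaims(r)

	context.Set(r, "jwt", claims) // leaks if never cleared for this pointer
	next(w, r)
	// context.Clear(r) here, or wrapping the whole chain in
	// context.ClearHandler, would release the entry again.
}
```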
@alenavorozhbieva Awesome, thanks for analyzing this and pointing to where the problem is. Would you be interested in making the changes to replace gorilla/context with http.Request.Context?
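For illustration, a sketch of what that replacement could look like using only the standard library; the key type and helper names below are assumptions, not Heketi's actual code:

```go
package middleware

import (
	"context"
	"net/http"
)

// ctxKey is an unexported key type so stored values cannot collide with
// keys set by other packages.
type ctxKey int

const claimsKey ctxKey = 0

// setClaims returns a shallow copy of the request whose context carries the
// claims. Everything stored this way is released when the request finishes,
// so nothing accumulates across requests.
func setClaims(r *http.Request, claims map[string]interface{}) *http.Request {
	return r.WithContext(context.WithValue(r.Context(), claimsKey, claims))
}

// getClaims retrieves the claims stored by setClaims, if present.
func getClaims(r *http.Request) (map[string]interface{}, bool) {
	claims, ok := r.Context().Value(claimsKey).(map[string]interface{})
	return claims, ok
}
```

In the middleware this would replace context.Set(r, ...) with r = setClaims(r, claims) before calling next(w, r), and downstream handlers would read the claims via getClaims(r) instead of context.Get.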
Hi @raghavendra-talur! @alenavorozhbieva and I work together, and we do not have the proper expertise in Go. We would be happy if you could make a PR yourself for the problem we identified. We are ready to test memory leak fixes in our environment.
Kind of issue
Bug
Observed behavior
Heketi memory usage grows continuously
Expected/desired behavior
Memory usage remains stable
Details on how to reproduce (minimal and precise)
I have just Heketi with one volume, and the issue still reproduces.
Information about the environment:
Other useful information
Memory usage of the Heketi container grows this way:
Also pprof top after heketi was just installed:
pprof top after 3 days:
Heap graph: pprof001.pdf
Example of heap trace: