heketi / heketi

RESTful based volume management framework for GlusterFS

Memory leak #1777

Closed: alenavorozhbieva closed this issue 4 years ago

alenavorozhbieva commented 4 years ago

Kind of issue

Bug

Observed behavior

heketi's memory usage grows continuously

Expected/desired behavior

memory usage stays roughly constant over time

Details on how to reproduce (minimal and precise)

I am running heketi with just one volume, and the issue still reproduces

Information about the environment:

Other useful information

Memory usage of the heketi container grows as shown in the attached graph (Screenshot 2020-09-07 at 21:50:35).

Also, pprof top output taken right after heketi was installed:

(pprof) top
Showing nodes accounting for 4.45MB, 100% of 4.45MB total
Showing top 10 nodes out of 66
      flat  flat%   sum%        cum   cum%
    1.50MB 33.72% 33.72%     1.50MB 33.72%  runtime.mapassign
    1.07MB 23.99% 57.71%     1.07MB 23.99%  compress/flate.(*compressor).init
    0.88MB 19.80% 77.51%     1.95MB 43.79%  compress/flate.NewWriter
    0.50MB 11.26% 88.77%     0.50MB 11.26%  reflect.unsafe_NewArray
    0.50MB 11.23%   100%     0.50MB 11.23%  net/url.parse
         0     0%   100%     1.95MB 43.79%  compress/flate.NewWriterDict
         0     0%   100%     1.95MB 43.79%  compress/zlib.(*Writer).Write
         0     0%   100%     1.95MB 43.79%  compress/zlib.(*Writer).writeHeader
         0     0%   100%     1.95MB 43.79%  encoding/binary.Write
         0     0%   100%     0.50MB 11.26%  encoding/json.(*decodeState).array

pprof top after 3 days:

(pprof) top
Showing nodes accounting for 88.02MB, 91.65% of 96.03MB total
Showing top 10 nodes out of 119
      flat  flat%   sum%        cum   cum%
   30.01MB 31.25% 31.25%    32.52MB 33.87%  runtime.mapassign
      17MB 17.70% 48.95%       17MB 17.70%  runtime.rawstringtmp
      11MB 11.46% 60.41%       11MB 11.46%  net/http.(*Request).WithContext
       6MB  6.25% 66.66%        6MB  6.25%  runtime.makemap
       5MB  5.21% 71.87%        5MB  5.21%  net/url.parse
       5MB  5.21% 77.07%        5MB  5.21%  context.WithCancel
       4MB  4.17% 81.24%        4MB  4.17%  reflect.unsafe_New
    3.50MB  3.64% 84.88%    34.01MB 35.41%  net/textproto.(*Reader).ReadMIMEHeader
    3.50MB  3.64% 88.53%     3.50MB  3.64%  net/http.readTransfer
       3MB  3.12% 91.65%        3MB  3.12%  runtime.convT2E

Heap graph: pprof001.pdf

Example of heap trace:

  13MB   runtime.mapassign
         net/textproto.(*Reader).ReadMIMEHeader
         net/http.readRequest
         net/http.(*conn).readRequest
         net/http.(*conn).serve
         runtime.goexit
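
For context, heap data like the above is typically pulled from a Go service's net/http/pprof endpoints with `go tool pprof`. The sketch below is a generic illustration of exposing such an endpoint, not heketi's actual profiling configuration, and the listen address is an assumption.

```go
// Generic sketch of how a Go service exposes heap profiles for `go tool pprof`.
// This is an illustration only; heketi's own profiling setup may differ.
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* on http.DefaultServeMux
)

func main() {
	// With this running, profiles like those above can be reproduced with:
	//   go tool pprof http://localhost:6060/debug/pprof/heap
	// and then the `top` command at the (pprof) prompt.
	log.Fatal(http.ListenAndServe("localhost:6060", nil))
}
```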

phlogistonjohn commented 4 years ago

Thanks for the report and for providing all that information. You may only have one volume, but as the heap info above shows, most of the memory used is in the HTTP request handling support. I bet the growth is actually proportional to the number of HTTP requests handled, and that you'd see a lot less growth if the server were not handling as many requests.

That said, the handling of requests, especially in server/rest/asynchttp.go, is probably sub-optimal. At least that is what I've believed for a while. We'd greatly appreciate it if you could look into whether this is the source of the growth!

alenavorozhbieva commented 4 years ago

@phlogistonjohn thank you for your reply. server/rest/asynchttp.go is not in the release/8 version yet. What should I look into instead?

phlogistonjohn commented 4 years ago

I didn't realize you were using a version that old. I'd suggest using the more recent v10.0.0 release and seeing if the problem is still exhibited by that version.

alenavorozhbieva commented 4 years ago

I deployed the v10 release and I still see the memory leak. New pprof: profile001.pdf

largest traces:

7.50MB   net/textproto.(*Reader).ReadMIMEHeader
         net/http.readRequest
         net/http.(*conn).readRequest
         net/http.(*conn).serve

   4MB   reflect.mapassign
         reflect.Value.SetMapIndex
         encoding/json.(*decodeState).object
         encoding/json.(*decodeState).value
         encoding/json.(*decodeState).unmarshal
         encoding/json.Unmarshal
         github.com/heketi/heketi/vendor/github.com/dgrijalva/jwt-go.(*Parser).ParseUnverified
         github.com/heketi/heketi/vendor/github.com/dgrijalva/jwt-go.(*Parser).ParseWithClaims
         github.com/heketi/heketi/vendor/github.com/dgrijalva/jwt-go.ParseWithClaims (inline)
         github.com/heketi/heketi/middleware.(*JwtAuth).ServeHTTP
         github.com/heketi/heketi/vendor/github.com/urfave/negroni.middleware.ServeHTTP
         github.com/heketi/heketi/vendor/github.com/urfave/negroni.(*Logger).ServeHTTP
         github.com/heketi/heketi/vendor/github.com/urfave/negroni.middleware.ServeHTTP
         github.com/heketi/heketi/vendor/github.com/urfave/negroni.(*Recovery).ServeHTTP
         github.com/heketi/heketi/vendor/github.com/urfave/negroni.middleware.ServeHTTP
         github.com/heketi/heketi/vendor/github.com/urfave/negroni.(*Negroni).ServeHTTP
         github.com/heketi/heketi/vendor/github.com/gorilla/mux.(*Router).ServeHTTP
         net/http.serverHandler.ServeHTTP
         net/http.(*conn).serve
alenavorozhbieva commented 4 years ago
(Attached graph of heketi v10 memory usage: heketi_10_mem_usage)
alenavorozhbieva commented 4 years ago

The root cause of the memory leak has been found. It is the use of gorilla/context with a Go version newer than 1.7 (heketi is built with Go 1.8); see https://github.com/gorilla/context. The problematic place in the heketi code is https://github.com/heketi/heketi/blob/master/middleware/jwt.go#L190, which produces the memory leak.
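
A minimal sketch of the leaky pattern is shown below (this is not heketi's exact code; the "claims" key and value are hypothetical). gorilla/context keeps a package-level map keyed by the *http.Request pointer, and entries are only removed when context.Clear runs for that same pointer. If nothing clears it, or if later middleware swaps the request via r.WithContext, the stored entries and the request headers they reference accumulate, which matches the ReadMIMEHeader/mapassign growth in the profiles above.

```go
// Sketch of the leaky pattern (not heketi's exact code; the "claims" key and
// value are hypothetical). gorilla/context stores values in a global map
// keyed by the *http.Request pointer; the entry outlives the request unless
// context.Clear runs for that exact pointer.
package main

import (
	"fmt"
	"log"
	"net/http"

	gcontext "github.com/gorilla/context"
)

func leakyAuth(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Stored under the current request pointer in a package-level map.
		gcontext.Set(r, "claims", map[string]string{"iss": "admin"})
		// No gcontext.Clear(r) ever runs for this pointer, so the entry
		// accumulates on every request. Wrapping with gcontext.ClearHandler
		// would help, but only if the pointer is never replaced downstream.
		next.ServeHTTP(w, r)
	})
}

func main() {
	h := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "claims:", gcontext.Get(r, "claims"))
	})
	log.Fatal(http.ListenAndServe(":8080", leakyAuth(h)))
}
```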

raghavendra-talur commented 4 years ago

@alenavorozhbieva Awesome, thanks for analyzing this and pointing to where the problem is. Would you be interested in making the changes to replace gorilla/context with http.Request.Context?
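
For reference, a minimal sketch of the replacement described here: attach the value to the request's own context via context.WithValue and r.WithContext, so it is released together with the request and there is no global map to clear. The key and value names are illustrative, not heketi's actual identifiers.

```go
// Minimal sketch of the proposed replacement: store the value on the
// request's own context instead of gorilla/context's global map.
// Key and value names are illustrative, not heketi's actual identifiers.
package main

import (
	"context"
	"fmt"
	"log"
	"net/http"
)

type ctxKey int

const claimsKey ctxKey = 0 // unexported key type prevents collisions with other packages

func contextAuth(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ctx := context.WithValue(r.Context(), claimsKey, map[string]string{"iss": "admin"})
		// Downstream handlers receive a shallow copy of the request; the
		// value lives exactly as long as the request itself.
		next.ServeHTTP(w, r.WithContext(ctx))
	})
}

func main() {
	h := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		claims, _ := r.Context().Value(claimsKey).(map[string]string)
		fmt.Fprintln(w, "claims:", claims)
	})
	log.Fatal(http.ListenAndServe(":8080", contextAuth(h)))
}
```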

iLeonidze commented 4 years ago

Hi @raghavendra-talur! @alenavorozhbieva and I work together, and we do not have the Go expertise to make this change ourselves. We would be happy if you could open a PR for the problem we pointed out, and we are ready to test memory leak fixes in our environment.