cch1 / http.async.client

Async Http Client - Clojure
http://cch1.github.com/http.async.client

Memory leak #39

Closed Suor closed 12 years ago

Suor commented 12 years ago

Here is my code:

(ns grabber.core
  (:require [http.async.client :as http]
            [http.async.client.request :as request]))

(defn closing [client func]
  (fn [& args]
    (http/close client)
    (apply func args)))

(defn async-get [url on-complete on-error]
  (let [client (http/create-client :follow-redirects true)
        request (request/prepare-request :get url)]
    (request/execute-request client request
                             :completed (closing client on-complete)
                             :error (closing client on-error))))

and REPL:

user=> (use 'grabber.core)
nil
user=> (dotimes [n 50] 
         (async-get "http://some.url" 
           #(do % (println (str "done " n))) #(println %2)))
done 5
done 4
done 3
done 6
done 2
done 10
...

And the process grows by 20-30 MB.

neotyk commented 12 years ago

Hi Alexander,

I see you do

client (http/create-client :follow-redirects true)

but you don't close the client later on. It is preferred to use a single instance of the client. Please look at http://neotyk.github.com/http.async.client/docs.html#sec-2-6-12 and http://neotyk.github.com/http.async.client/docs.html#sec-2-6.

Please reopen this issue if you can still reproduce the memory leak with a single client.
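For reference, a minimal sketch of the single-client pattern (the url list and callbacks here are placeholders):

; One client shared across all requests; closed once when done.
(defn async-get-all [urls on-complete on-error]
  (with-open [client (http/create-client :follow-redirects true)]
    (let [responses (doall
                      (for [url urls]
                        (request/execute-request
                          client (request/prepare-request :get url)
                          :completed on-complete
                          :error on-error)))]
      ; await every response before with-open closes the client
      (doseq [resp responses]
        (http/await resp)))))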

Suor commented 12 years ago

Actually I do close the client, via the closing wrapper around the callbacks. And I need a separate client for each request, to use different proxies, user agents and such.
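A sketch of what I mean, assuming create-client accepts :user-agent and :proxy options (the option values below are made up):

; One short-lived client per request, configured per call.
(defn async-get-via [url {:keys [user-agent proxy]} on-complete on-error]
  (let [client (http/create-client :follow-redirects true
                                   :user-agent user-agent
                                   :proxy proxy)] ; e.g. {:host "10.0.0.1" :port 8080}
    (request/execute-request client (request/prepare-request :get url)
                             :completed (closing client on-complete)
                             :error (closing client on-error))))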

neotyk commented 12 years ago

Reopening; I'll have to reproduce it.

Suor commented 12 years ago

I also have the same leak in these two:

; This is for testing purposes only
(defn sync-get2 [url on-complete on-error]
  (with-open [client (http/create-client :follow-redirects true)]
    (let [request (request/prepare-request :get url)
          result (request/execute-request client request)]
      (http/await result)
      result)))

(defn sync-get3 [url on-complete on-error]
  (with-open [client (http/create-client :follow-redirects true)]
    (let [result (http/GET client url)]
      (http/await result)
      nil)))

Suor commented 12 years ago

I finally reduced it to

(defn touch-client []
  (let [client (http/create-client)]
    (http/close client)))

Looks like something is not destroyed properly during the .close call.

neotyk commented 12 years ago

That narrows it down very nicely. Thank you for investigating it.

Suor commented 12 years ago

Here is my project https://github.com/Suor/grabber

neotyk commented 12 years ago

With YourKit, after executing touch-client 10000 times, all objects created by http.async.client are GCed. I also ran sync-get3, sync-get2 and async-multi, and didn't manage to reproduce it. How did you detect the memory leak? Did you get an OutOfMemoryError?
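For comparison, a rough way to check live heap from the REPL, using plain java.lang.Runtime (System/gc is only a hint to the collector):

; used heap = total - free, sampled after a GC hint
(defn used-heap-mb []
  (System/gc)
  (let [rt (Runtime/getRuntime)]
    (quot (- (.totalMemory rt) (.freeMemory rt)) (* 1024 1024))))

Calling used-heap-mb before and after (dotimes [n 10000] (touch-client)) shows whether live objects actually accumulate.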

Suor commented 12 years ago

Here is what I have:

suor@zenbook:~$ uname -a
Linux zenbook 3.2.14-suor-zbook #1 SMP Sun Apr 29 10:15:26 KRAT 2012 x86_64 x86_64 x86_64 GNU/Linux
suor@zenbook:~$ java -version
java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.1) (6b24-1.11.1-4ubuntu3)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)

Then I go into my project https://github.com/Suor/grabber start lein repl and do:

user=> (use 'grabber.core)
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
nil

Then I run top in a separate console, watch the java -cp /home/suor/projects/grabber/test... process, and execute this in the REPL:

user=> (dotimes [n 100] (touch-client))
nil

The process grows by about 50 MB; then I repeat and it grows again. It keeps doing so until it is 1 GB in size, and then it stops growing for whatever reason. I don't get an OutOfMemoryError, but still the process should not be that big.

neotyk commented 12 years ago

This is all good. The JVM is allocating all the memory it is allowed to; the resident size you see in top reflects heap the JVM has reserved, not live objects. You can provide startup options to the JVM to limit the max heap size. With Leiningen, for example, the cap can go in project.clj via :jvm-opts (a sketch; the versions and the 256m value are illustrative):
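; project.clj excerpt: cap the heap so the process stays small
(defproject grabber "0.1.0"
  :dependencies [[org.clojure/clojure "1.3.0"] ; versions illustrative
                 [http.async.client "0.4.5"]]
  :jvm-opts ["-Xmx256m"])

HTH, Hubert.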