Closed yegor256 closed 8 years ago
@dmzaytsev there is a memory leak somewhere... we're going over 512MB in just about 10 minutes of work
@rultor release, tag is 0.22.5
@rultor release, tag is 0.22.5
@yegor256 OK, I will release it now. Please check the progress here
@rultor release, tag is 0.22.5
@yegor256 Done! FYI, the full log is here (took me 15min)
@karato valid bug
@dmzaytsev any ideas?
@dmzaytsev I increased the size of the VM at Heroku to 1GB of RAM. The problem is still there: we're eating the entire memory in 20-30 minutes. There is definitely a memory leak somewhere... some static variables, I assume...
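For illustration, the kind of pattern being suspected here looks roughly like this (a hypothetical sketch, not actual Netbout code): a static collection used as an ad-hoc cache that only ever grows, so every request pins more objects the GC can never reclaim.

```java
// Hypothetical illustration of the suspected leak pattern, NOT actual Netbout code:
// a static map used as an ad-hoc cache keeps growing because entries are never evicted.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public final class LeakyCache {
    // Lives for the whole JVM lifetime; every new key stays here forever.
    private static final Map<String, byte[]> CACHE = new ConcurrentHashMap<>();

    public static byte[] render(final String key) {
        // computeIfAbsent adds an entry per unique key and never removes it,
        // so memory usage grows with the number of distinct keys seen.
        return CACHE.computeIfAbsent(key, k -> new byte[1024]);
    }
}
```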
@yegor256 it could be either a memory leak or normal memory consumption by jcabi's @Cacheable. Unfortunately, @Cacheable doesn't allow setting an eviction policy for its cache other than by time: we can't set a maximum size for the cache.
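For context, a minimal sketch of the kind of @Cacheable usage in question (the class and method names are hypothetical): jcabi-aspects lets you expire entries by lifetime, but there is no attribute for a maximum number of cached entries.

```java
// Sketch of a typical jcabi-aspects @Cacheable setup (hypothetical names):
// expiration can only be set by lifetime, not by a maximum cache size.
import com.jcabi.aspects.Cacheable;
import java.util.concurrent.TimeUnit;

public final class Markdown {
    @Cacheable(lifetime = 5, unit = TimeUnit.MINUTES)
    public String html(final String source) {
        // expensive rendering; the result stays cached for 5 minutes,
        // but there is no cap on how many distinct inputs get cached
        return this.render(source);
    }

    private String render(final String source) {
        // placeholder for the real rendering logic
        return source;
    }
}
```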
@yegor256 let's shift back to PegDown to test this
@dmzaytsev yeah, but we didn't have any issues for months... let's try PegDown. We're having serious issues now; I have to restart the Heroku dyno every 20 minutes.
@yegor256 I'm going to release the latest changes now, and then will shift back to PegDown in a new release
@rultor release, tag=2.22.6
@rultor release, tag=2.22.6
@dmzaytsev OK, I will release it now. Please check the progress here
@rultor release, tag=2.22.6
@dmzaytsev Done! FYI, the full log is here (took me 11min)
@yegor256 do you see the memory leak after the latest release?
@dmzaytsev watching...
@karato assign me please
@yegor256 I'm ready to shift to PegDown; let me know if the problem is still there
@dmzaytsev yes, the problem is still there
@rultor release, tag=2.22.7
@karato valid bug
@dmzaytsev I tagged this as "bug"
@rultor release, tag=2.22.7
@dmzaytsev OK, I will release it now. Please check the progress here
@rultor release, tag=2.22.7
@dmzaytsev Done! FYI, the full log is here (took me 13min)
@yegor256 could you check again?
@dmzaytsev well... I don't see a problem any more. let's monitor for a few more hours
@dmzaytsev in the meantime, I would strongly recommend creating a few performance/stress tests to prevent this from happening in the future. We MUST control how stable the product is BEFORE deploying it to production
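A minimal sketch of the kind of stress test being suggested (the names are hypothetical, and it reuses the Markdown sketch above): exercise the suspected code path many times and fail if the heap keeps growing.

```java
// Hypothetical stress test sketch: drive the suspected code path repeatedly
// and assert that used heap does not grow beyond a rough threshold.
import org.junit.Test;
import static org.junit.Assert.assertTrue;

public final class MemoryStressTest {
    @Test
    public void doesNotLeakUnderRepeatedLoad() throws Exception {
        final Runtime runtime = Runtime.getRuntime();
        System.gc();
        final long before = runtime.totalMemory() - runtime.freeMemory();
        for (int idx = 0; idx < 100_000; ++idx) {
            // call the code path suspected of leaking, e.g. markdown rendering
            new Markdown().html("text-" + idx);
        }
        System.gc();
        final long after = runtime.totalMemory() - runtime.freeMemory();
        // allow some slack; a real leak grows by far more than 50 MB here
        assertTrue("memory grew too much", after - before < 50L * 1024 * 1024);
    }
}
```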
@dmzaytsev nope, it didn't help. I put the memory limit back to 512m in Heroku and we see "memory quota exceeded" again, after about 15-20 minutes of work
@dmzaytsev just start the server locally, put heavy traffic on it (using Apache Bench, for example), and see what's going on inside with a profiler. You will find the cause of the memory leak pretty quickly.
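As a rough sketch of that approach in Java (the URL, thread count, and request volume are assumptions), one could hammer a locally running instance from a thread pool while a profiler such as VisualVM is attached:

```java
// Minimal load generator sketch (hypothetical URL and volumes): hit a local
// instance from several threads while watching the heap in a profiler.
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public final class LoadGenerator {
    public static void main(final String... args) throws Exception {
        final ExecutorService pool = Executors.newFixedThreadPool(50);
        for (int idx = 0; idx < 10_000; ++idx) {
            pool.submit(() -> {
                try {
                    final HttpURLConnection conn = (HttpURLConnection)
                        new URL("http://localhost:8080/").openConnection();
                    conn.getResponseCode();
                    conn.disconnect();
                } catch (final Exception ex) {
                    // ignore individual failures; we only care about heap growth
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10L, TimeUnit.MINUTES);
    }
}
```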
@yegor256 It's not a simple task to emulate a real workload locally... Just putting heavy traffic on the root page is not enough; we need to make requests like a real user: log in, open bouts, post messages, scroll pages, etc. I would propose setting up heap dump generation on Heroku, as described here: https://devcenter.heroku.com/articles/java-memory-issues#generating-heap-dumps-in-the-background
You can generate a heap dump every 30 minutes and upload the dumps to an S3 bucket (see the link above). Then we can analyze these dumps.
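For reference, a heap dump can also be triggered from inside the JVM through the HotSpot diagnostic MBean; the Heroku article uses jmap, but a minimal programmatic sketch (the dump path and schedule here are assumptions) looks like this:

```java
// Sketch: trigger a heap dump programmatically via the HotSpot diagnostic MBean
// (the dump path is an assumption; Heroku's guide drives jmap from a script instead).
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public final class HeapDumper {
    public static void main(final String... args) throws Exception {
        final HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
            ManagementFactory.getPlatformMBeanServer(),
            "com.sun.management:type=HotSpotDiagnostic",
            HotSpotDiagnosticMXBean.class
        );
        // second argument 'true' dumps only live (reachable) objects
        bean.dumpHeap("/tmp/heap-" + System.currentTimeMillis() + ".hprof", true);
    }
}
```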
@yegor256 I set the milestone to 3.1 since there is nothing set yet
@dmzaytsev you can download the dumps from http://jmap.netbout.com (login: team, pwd: sugar16). But keep in mind that I restart the Heroku process every 20 minutes now, by cron, so the information may not be very helpful. I can stop those restarts, but the system will run out of memory very soon...
@dmzaytsev I set the jmap dump interval to 10 minutes:
heroku config:set JMAP_INTERVAL=10
@rultor release, tag=2.22.8
@rultor release, tag=2.22.8
@dmzaytsev OK, I will release it now. Please check the progress here
@rultor release, tag=2.22.8
@dmzaytsev Done! FYI, the full log is here (took me 17min)
@yegor256 it seems the dumps appear more often than every 20 minutes. Would it be possible to display the creation time of the file?
@dmzaytsev they are created every 10 minutes... should we make them less frequent?
@dmzaytsev for the file creation time, submit a ticket to https://github.com/yegor256/s3auth
@yegor256 let's leave it as is
@yegor256 is it eating the memory right now? The dumps are not too big.
@dmzaytsev well, I restart it every 20 minutes... I can stop that and the memory leak will show up... stopped. let's see
@yegor256 the biggest file is
@yegor256 but you said the memory limit is 512MB, right?
@yegor256 well... I'm not an expert at analyzing memory dumps, but as far as I can see there are two problems:
@karato assign me please
@dmzaytsev sure, the task is yours now
@dmzaytsev now it's 1GB on Heroku, and we set 512MB for the JVM
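A quick way to double-check that setting (a small sketch, nothing project-specific): print the heap ceiling the running JVM actually sees, to confirm the 512MB limit took effect on the 1GB dyno.

```java
// Sanity-check sketch: report the maximum heap the JVM was actually given.
public final class MaxHeap {
    public static void main(final String... args) {
        System.out.printf(
            "max heap: %d MB%n",
            Runtime.getRuntime().maxMemory() / (1024 * 1024)
        );
    }
}
```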
@dmzaytsev I changed the interval: heroku config:set JMAP_INTERVAL=120
@yegor256 you don't restart by cron now, right?
@dmzaytsev I restart every 4 hours
@rultor release, tag=2.22.9
From Heroku logs: