assemble / grunt-assemble-i18n

Assemble middleware for adding i18n support to projects.

Out of memory on compilation #37

Closed ain closed 10 years ago

ain commented 10 years ago

With pages collection we're currently running out of memory on large datasets:

FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory
Abort trap: 6

Any ideas for a workaround?

ain commented 10 years ago

@doowb @jonschlinkert any smart ideas about how to prevent pages[] growing out of hand?

jonschlinkert commented 10 years ago

@ain have you confirmed that this is an assemble-specific issue? I've read a lot about memory leaks and limitations in node.js. e.g. http://blog.caustik.com/2012/04/11/escape-the-1-4gb-v8-heap-limit-in-node-js/

ain commented 10 years ago

@jonschlinkert I thought about memory leaks. I'm using #33 and it's rather straightforward. I'm afraid it's just that the pages array grows over boundaries :(

ain commented 10 years ago

@jonschlinkert I'm also not sure the leak is in this module at all. At the point where i18n returns the pages array, memory is fine; it's during the Assembling… step that memory grows constantly.

jonschlinkert commented 10 years ago

I'm afraid it's just that the pages array grows over boundaries

Well, consider that you have:

@doowb, any thoughts on this?

jonschlinkert commented 10 years ago

how many pages are you building @ain? also, I'm guessing it's closed-source, which is fine, but it would be great if there was a way for us to even emulate this so we can dig into it...

ain commented 10 years ago

@jonschlinkert I'm building 1452 pages in 33 languages from a single template structure of 44 templates. The language files have 687 lines on average, ca. 30-40 KB each.

LaurentGoderre commented 10 years ago

Is there a way you can naturally split it up?

ain commented 10 years ago

@LaurentGoderre that's what I'm currently thinking about, last resort. I don't want to couple the Gruntfile to content though, so I'm trying to come up with generic glob patterns, a convention.

ain commented 10 years ago

@jonschlinkert I'll look into ways I can provide you a package that you can use to reproduce the issue.

doowb commented 10 years ago

This is interesting... I was wondering when a memory limit in node would be reached. We don't pay any attention to memory so we just load all the data and get it from specific places when we need it.

If you don't have links between different languages, you might be able to dynamically add targets to the Gruntfile based on the language array so they each run separately. This will probably cause the build to run slower because it'll have to read in the data and template files for each target.
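The dynamic-targets idea could look something like this (a sketch only: the target shape, paths, and function name are illustrative, not the actual grunt-assemble-i18n API). One assemble target is generated per language, so each target only has to hold one language's data in memory:

```javascript
// Build one assemble target per language so Grunt processes them
// separately instead of loading every language's data at once.
function buildAssembleTargets(languages) {
  var targets = {};
  languages.forEach(function (lang) {
    targets[lang] = {
      options: { data: ['i18n/' + lang + '.yml'] },
      files: [{
        expand: true,
        cwd: 'templates',
        src: ['**/*.hbs'],
        dest: 'dist/' + lang
      }]
    };
  });
  return targets;
}

// In a Gruntfile this would be wired up roughly as:
// grunt.initConfig({ assemble: buildAssembleTargets(['en', 'de', 'fr']) });
```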

jonschlinkert commented 10 years ago

This will probably cause the build to run slower because it'll have to read in the data and template files for each target.

Actually it may run a hell of a lot faster b/c only the data required by each target would be used. but the key to your suggestion is "dynamically" add targets. might not be a bad way to go

ain commented 10 years ago

@jonschlinkert is there a way in Grunt context to escape the heap limit?

doowb commented 10 years ago

I took a look at the article @jonschlinkert linked and you should try:

node --max-old-space-size=8192 /usr/local/bin/grunt

The last part is the path to my grunt-cli. Basically, you can set v8 flags when you call node directly, but not when calling through grunt, so you have to run the grunt-cli with node.

Also, 8192 is in MB, so that's 8 GB. Let me know if this does anything helpful.
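To spare everyone from typing the full command by hand, it can be wrapped in an npm script (a sketch; the script name and the grunt-cli path are illustrative):

```json
{
  "scripts": {
    "build": "node --max-old-space-size=4096 node_modules/.bin/grunt"
  }
}
```

Then `npm run build` picks up the flag for every developer on the project.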

jonschlinkert commented 10 years ago

just a thought, but it might be better for @ain if we focus on getting v0.5 finished up instead of debugging issues on the current version. we're sooo close!

ain commented 10 years ago

True. And I'm a perfect stress-test.

Sent from Gmail Mobile

jonschlinkert commented 10 years ago

lol :+1:

ain commented 10 years ago

@doowb @jonschlinkert reporting back: ran the task with `--max-old-space-size=4096` and it went through smoothly and much faster. No problems; grunt used about 2.2 GB in the end.

jonschlinkert commented 10 years ago

:+1: should we close?

ain commented 10 years ago

Yup, let's close. The reference is here for those who need it.

jonschlinkert commented 10 years ago

thanks!

LaurentGoderre commented 10 years ago

Is there any other way than changing the heap size to fix this? It's highly problematic to ask all developers to start using this command.

jonschlinkert commented 9 years ago

@LaurentGoderre, besides https://github.com/wet-boew/wet-boew, are there other projects that might have this issue? If so would you mind linking to them so we can take a look and see if we have any ideas?

ain commented 9 years ago

I implemented a gc trigger and a unikeys feature that significantly cut the memory consumption of i18n.

The problem was that all translations for all languages were loaded into memory, since any page could use (and sometimes needs) keys from other languages' translation files. Hence unikeys: it defines the keys that should be universally available, so only those are kept around for every language. I also found that triggering gc regularly helps. I have the fork, and as soon as I have time I'll push the PR.
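A minimal sketch of the two ideas (function names are illustrative, not the actual fork's API): guard the gc call so the build still works when node wasn't started with `--expose-gc`, and filter each translation dictionary down to the universal keys:

```javascript
// Trigger a V8 garbage collection if node was started with --expose-gc;
// otherwise do nothing, so the build still runs without the flag.
function maybeCollectGarbage() {
  if (typeof global.gc === 'function') {
    global.gc();
    return true;
  }
  return false;
}

// Keep only the "universal" keys from a full translation dictionary,
// so other languages' files don't have to stay in memory whole.
function pickUniversalKeys(translations, unikeys) {
  var out = {};
  unikeys.forEach(function (key) {
    if (Object.prototype.hasOwnProperty.call(translations, key)) {
      out[key] = translations[key];
    }
  });
  return out;
}
```

Calling `maybeCollectGarbage()` between language batches keeps the heap from accumulating across the whole run.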

jonschlinkert commented 9 years ago

sounds good, thanks

LaurentGoderre commented 9 years ago

I wonder if having an option to throttle would help as well