Open 4b4ee2e4-11b0-41f6-9cf3-b9ae77b3eb2d opened 3 years ago
Oh, and before you go too deep, it's probably worth confirming your clang binary is actually 64-bit ("file clang" or something like that).
For the recommended "string literal" value, would having a string like static char buff[NSIZE] = "\x7a\x6c\x69\x62\x20\x69..." be much less resource intensive?
Yes. I think last time someone looked, the memory usage for a string like that ends up at roughly 3x the size of the array? Vs. your original testcase, where it's on the order of 100x.
The ulimits look to be quite permissive; at least the important ones (if I am not mistaken) are unlimited:
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 127341
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 127341
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
"out of memory" means that malloc failed. If you're using a 64-bit Linux clang binary, usually you'll trigger the OOM killer well before you actually hit a malloc failure, so you have some sort of unusual configuration. Maybe ulimit? In any case, you'll probably have to investigate yourself.
Note that files of that form use much more memory than you might expect at first glance because clang constructs multiple AST nodes for each byte. If you control the process generating these files, you might consider doing something else; for example, a string literal uses much less memory.
Extended Description
The following failure is seen frequently (but cannot be reproduced consistently) for some large files: "LLVM ERROR: out of memory".
The build machine has over 32GB of RAM, and most of the time more than 20GB are shown as free when these failures occur.
Each of the large files where the failure occurs contains a single very large array, used in testing, of the form:
unsigned char pSource13[] = { 0x50, 0x4e, 0x6b, 0x57, 0x30, 0x71, 0x33, 0x59, 0x49, 0x31, 0x30, 0x30, .... 0x68, 0x36, 0x71, 0x55, 0x31, 0x66, 0x48, 0x54, 0x51, 0x58, 0x30, 0x3e, 0x55, 0x56, 0x53, 0x47 };
Attached are some of the trace files for this. I cannot attach the source file as it is extremely large (more than 150MB), but a trimmed-down version was added for reference.
LLVM ERROR: out of memory
Stack dump:
PLEASE ATTACH THE FOLLOWING FILES TO THE BUG REPORT:
Preprocessed source(s) and associated run script(s) are located at:
clang: note: diagnostic msg: /tmp/zlib_25MB_unique_srting_TC19_in-29a3fb.c
clang: note: diagnostic msg: /tmp/zlib_25MB_unique_srting_TC19_in-29a3fb.sh
clang: note: diagnostic msg: