h3ph4est7s opened this issue 5 years ago
@h3ph4est7s, do you have a Java heap dump file (.hprof) from the JVM that you could share with me? The stack trace shows me exactly what code was executing when we ran out of memory, but it does not tell me what was actually using the memory the way a heap dump would.
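If capturing a dump later becomes possible, one standard way to produce an .hprof from a running HotSpot JVM is the HotSpotDiagnosticMXBean (the JVM flag -XX:+HeapDumpOnOutOfMemoryError is another). A minimal sketch; the class name and output path here are illustrative, not part of Autopsy:

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.io.IOException;
import java.lang.management.ManagementFactory;

public class HeapDumper {
    // Writes a heap dump of the current JVM to the given path.
    // The path must not already exist and should end in ".hprof".
    public static void dump(String outputPath) throws IOException {
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        bean.dumpHeap(outputPath, true); // true = dump only live objects
    }
}
```

The resulting file can then be opened in VisualVM or Eclipse MAT to see which objects are actually holding the memory.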
@rcordovano Unfortunately this is a production machine and currently contains evidence, so I cannot transfer any kind of sensitive data outside of this environment. I believe this error is related to an effort to load indexed entries into memory for speed, but I am not sure whether that exceeds 72GB. By the way, thank you for your interest. Also, the system never allocated all 72GB of heap memory; it usually throws right after the search button is pressed.
@h3ph4est7s, I would love to help you, but please don't give me too much credit for my interest, because I am the tech lead for Autopsy development here at Basis Technology. Unfortunately, the stack trace and the other information you have provided so far are just not enough for me to go on - it's the nature of the beast when heap memory is exhausted.
Your last remark is, however, intriguing. When you speak of "loading indexed entries to memory for speed" what are you doing, exactly? Based on the stack trace, perhaps you defined a keyword search that would return hits for most / all of the content in the case using the UI widgets in the upper right hand corner and then pressed the search button and the exception occurred? I.e., maybe "loading all the indexed entries into memory" means you wanted all of the files in the case that were indexed for search to appear in a single tab in the right hand side of the window so that you could scroll through them looking at the text in the Indexed Text tab? Just speculating here...and leaving the office for the day.
Oh, I see. Sorry for my ignorance, and congratulations on the amazing job with this platform. Let me give you some insight into this case. It comprises 4 data sources of 500GB each. I ran the keyword search module with a keyword list for every data source. The problem began when I disabled the periodic search (for speed) and the show-keyword-preview option. When I was done with the ingestion process, after many days, I noticed that the search results did not appear under Results - Keyword Hits as usual. When I realized that, I tried to run a manual search using the Keyword Lists drop-down search, and I ended up with this error 😢
I believe I found out what is wrong here. The error originates from the Solr server. This claim can be verified by giving start.jar a 10GB max heap size on its initial execution: that mitigates the OutOfMemoryError, and under observation with VisualVM the start.jar heap steadily grows to around 650MB +/-. The feature of setting this value dynamically is not yet implemented; the Solr arguments are hardcoded and on the TODO list, as can be seen here

https://github.com/sleuthkit/autopsy/blob/5964efb5b53c826065151145b42637cdeb108c73/KeywordSearch/src/org/sleuthkit/autopsy/keywordsearch/Server.java#L185

and here

https://github.com/sleuthkit/autopsy/blob/5964efb5b53c826065151145b42637cdeb108c73/KeywordSearch/src/org/sleuthkit/autopsy/keywordsearch/Server.java#L364
@h3ph4est7s, thank you very much, your analysis is very helpful. I am going to write this up in our internal issue tracking system and I will put an engineer on it as soon as I can.
@h3ph4est7s I find it interesting that Solr encountered an OOME while you were running a query. The advice we've seen suggests keeping the Solr heap small so that as much of the index as possible can be loaded into the file system cache. Can you share some more details with us?
Thanks.
Changes have been made such that the next version of Autopsy (64 bit) will include the ability to configure the maximum heap size for the embedded Solr server.
I've had problems with out-of-memory errors in the keyword search module. After some tests, I found out that you can stop the Solr server (responsible for the keyword search module) and start it again with different memory parameters. This is what I've done:
This procedure also works while Autopsy is running the keyword search, but I wouldn't recommend doing it.
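The exact commands were omitted above, so here is only a rough sketch of what such a restart might look like from code, in the spirit of Server.java's runSolrCommand. The class name, the solrHome layout, and the heap value are all hypothetical; adjust to your install:

```java
import java.io.IOException;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class SolrRestart {
    // Builds the start.jar command line with a caller-chosen max heap,
    // e.g. buildCommand("10g") -> java -Xmx10g -jar start.jar
    public static List<String> buildCommand(String maxHeap) {
        List<String> cmd = new ArrayList<>();
        cmd.add("java");
        cmd.add("-Xmx" + maxHeap); // the memory parameter being changed
        cmd.add("-jar");
        cmd.add("start.jar");
        return cmd;
    }

    // Launches Solr from its home directory (hypothetical path handling).
    public static Process start(String solrHome, String maxHeap) throws IOException {
        ProcessBuilder pb = new ProcessBuilder(buildCommand(maxHeap));
        pb.directory(Paths.get(solrHome).toFile());
        pb.inheritIO(); // surface Solr's console output
        return pb.start();
    }
}
```

Doing this while an ingest is writing to the index risks losing in-flight work, which is presumably why the commenter advises against restarting mid-search.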
I'm getting the following error when I try to search a very large case database for keywords.
Product version: 4.9.1
System Memory: 84GB
VM Heap size: 72GB (also tried with 8GB)
Total indexed files: 5,501,250
Total chunks in index: 9,175,626