Closed: ohlr closed this issue 3 years ago.
Probably not much that can be done about that. To run the review, DarkMark needs to build a table of all annotations, the "image index". If you don't have enough memory available for that, then ...?
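Conceptually the index holds one entry per annotation, something along these lines (illustrative only; these are not DarkMark's actual types):

```cpp
#include <opencv2/opencv.hpp>
#include <string>
#include <vector>

// Illustrative sketch only; not DarkMark's actual data structures.
// The review index keeps one entry per annotation, each with a cached
// crop, which is why memory grows with the number of marks rather
// than the number of images.
struct ReviewEntry
{
    std::string image_filename;  // image the mark came from
    cv::Rect    mark;            // bounding box of the annotation
    cv::Mat     thumbnail;       // cached crop shown in the review table
};

std::vector<ReviewEntry> review_index;  // one entry per annotation
```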
The size of the data set is not important. The critical part is the size and the number of annotations you have. The number of annotations is shown on the launcher screen; see the row called "number of marks". I have one project where this number is 107K, but each annotation is relatively small. When I review that project, I can see my system memory usage increase from 3.2 GB to 9.6 GB. This computer (actually a VM) is also a 16 GB system.
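Back of the envelope, assuming all of that growth comes from the review index, that works out to roughly (9.6 - 3.2) GB / 107,000 marks, or about 60 KB per annotation, most of which is presumably the cached crop for each mark.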
Can you provide details on your images/annotations and memory usage?
Would it be possible to load them into the cache sequentially, similar to some form of pagination (a pattern from web development)?
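Something along these lines is what I have in mind (purely a sketch with made-up names, not based on DarkMark's code): only the page currently shown in the review window would be held in memory.

```cpp
#include <cstddef>
#include <functional>
#include <string>
#include <vector>

// Purely illustrative sketch of the pagination idea; none of these
// names come from DarkMark.  Instead of building the whole review
// cache up front, only the page currently on screen is loaded, so
// memory stays bounded by page_size regardless of dataset size.
struct PagedReviewCache
{
    // Placeholder for whatever is stored per annotation.
    using Entry  = std::string;
    using Loader = std::function<std::vector<Entry>(std::size_t offset, std::size_t count)>;

    PagedReviewCache(std::size_t page_size, Loader loader)
        : page_size(page_size), loader(std::move(loader)) {}

    // Return the entries for the requested page, re-using the cached
    // page when possible and discarding the previous one otherwise.
    const std::vector<Entry> & get_page(std::size_t page_number)
    {
        if (page_number != cached_page)
        {
            cache       = loader(page_number * page_size, page_size);
            cached_page = page_number;
        }
        return cache;
    }

    std::size_t        page_size;
    Loader             loader;
    std::size_t        cached_page = static_cast<std::size_t>(-1);
    std::vector<Entry> cache;
};
```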
The launcher page has an exclusion regex you can apply. So if like me you split your images into sets, you can exclude some of the sets, run the "review" functionality, and then update the regex to test the 2nd set, etc.
For example, set_01 through set_07 are excluded in this screenshot, and only images from set_08 and higher would be included:
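If the exclusion field is an ordinary regular expression matched against each image path (an assumption on my part), a pattern such as set_0[1-7] would cover the first seven sets. A quick illustration of how it behaves:

```cpp
#include <iostream>
#include <regex>
#include <string>

// Hypothetical illustration of an exclusion pattern for set_01 through
// set_07, assuming the filter is a plain regex applied to image paths.
// Paths from set_08 and higher do not match and would be kept.
int main()
{
    const std::regex exclude("set_0[1-7]");

    for (const std::string path : {"set_01/img0001.jpg", "set_07/img0042.jpg", "set_08/img0099.jpg"})
    {
        const bool excluded = std::regex_search(path, exclude);
        std::cout << path << (excluded ? " excluded" : " included") << std::endl;
    }

    return 0;
}
```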
This should solve it. If you want, I can close the issue.
However, maybe it would be good not to fail silently.
By "silently" I'm guessing you mean the Linux kernel OOM killer. Unfortunately, when Linux decides it is out of ram and it will kill a process to get some ram back, it doesn't warn the application. Only way to tell what happened is to run something like dmesg
to see the OOM messages.
The only thing I can think of which might help is to resize the images to the exact size needed while building the index, but that will only help people who have extremely large annotations. I will experiment with this and see if it makes a significant difference.
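As a rough sketch of that idea (plain OpenCV here, with made-up names, not DarkMark's actual code), the crop for each annotation could be scaled down to the size the review table actually displays before it is cached:

```cpp
#include <opencv2/opencv.hpp>

// Rough sketch of the "resize while building the index" idea, not
// DarkMark's actual code.  Instead of caching the full-size crop of
// each annotation, scale it down to the size the review table will
// display, so extremely large annotations no longer dominate memory.
cv::Mat make_review_thumbnail(const cv::Mat & image, const cv::Rect & annotation, const cv::Size & display_size)
{
    // Clip the annotation rectangle to the image bounds before cropping.
    const cv::Rect roi = annotation & cv::Rect(0, 0, image.cols, image.rows);
    if (roi.empty())
    {
        return cv::Mat();  // annotation lies entirely outside the image
    }

    cv::Mat thumbnail;
    cv::resize(image(roi), thumbnail, display_size, 0.0, 0.0, cv::INTER_AREA);

    return thumbnail;  // cache this instead of the full-size crop
}
```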
If you get a chance, try the latest version (1.5.17-1). I've significantly reduced the amount of memory used when bringing up the review window.
When reviewing marks on a large dataset, I get an out-of-memory error (16 GB RAM) and DarkMark crashes.