beehive-lab / TornadoVM

TornadoVM: A practical and efficient heterogeneous programming framework for managed languages
https://www.tornadovm.org
Apache License 2.0

Allocate big arrays unsupported #335

Closed — jjfumero closed this issue 2 months ago

jjfumero commented 6 months ago

Describe the bug

When running Execution Plans with large allocations (e.g., arrays of >= 1 GB), the TornadoVM runtime throws a memory exception, even though the device memory limit is configured to accommodate large buffers:

tornado-test -V --fast --jvm="-Dtornado.device.memory=2048MB" uk.ac.manchester.tornado.unittests.multithreaded.MultiThreaded 
tornado --jvm "-Xmx6g -Dtornado.recover.bailout=False -Dtornado.unittests.verbose=True -Dtornado.device.memory=2048MB"  -m  tornado.unittests/uk.ac.manchester.tornado.unittests.tools.TornadoTestRunner  --params "uk.ac.manchester.tornado.unittests.multithreaded.MultiThreaded"
WARNING: Using incubator modules: jdk.incubator.vector
Test: class uk.ac.manchester.tornado.unittests.multithreaded.MultiThreaded
    Running test: test01                     ................  [FAILED] 
        \_[REASON] Unable to allocate 1073741848 bytes of memory.
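As a side note on the reported figure: 1073741848 bytes is exactly 1 GiB plus 24 bytes, which plausibly corresponds to a 2^28-element float array plus a fixed per-array header (the 24-byte header size is an assumption, not taken from the log). A minimal sketch of that arithmetic:

```java
public class AllocationSize {
    // Size reported in the test failure above
    static final long REPORTED_BYTES = 1_073_741_848L;

    // Bytes needed for n floats plus a fixed per-array header
    // (the 24-byte header is an assumption for illustration)
    static long bufferBytes(long numFloats, long headerBytes) {
        return numFloats * Float.BYTES + headerBytes;
    }

    public static void main(String[] args) {
        long numFloats = 1L << 28; // 2^28 floats * 4 bytes = 1 GiB of payload
        long total = bufferBytes(numFloats, 24);
        System.out.println(total);                    // 1073741848
        System.out.println(total == REPORTED_BYTES);  // true
    }
}
```

This matters because a 1 GiB payload plus any header overhead already exceeds a power-of-two 1 GB buffer limit, so an off-by-a-few-bytes failure at exactly the 1 GiB boundary is consistent with a hard per-buffer cap rather than overall memory exhaustion.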

Expected behavior

The TornadoVM runtime should either be able to allocate large buffers, or throw a more specific exception stating that the requested size is not supported. On some platforms (e.g., OpenCL), such large allocations might not be possible at all.


jjfumero commented 2 months ago

This is now fixed. I will close this issue.