Closed dcellwanger closed 3 years ago
@dcellwanger unfortunately, this is a limitation of the current Java client, which only supports object sizes up to 2^31 bytes. The root cause is that arrays in Java can only be indexed by signed 32-bit integers, which means the largest array size in Java is 2^31-1 = 2,147,483,647. Although Rserve itself supports payloads of up to ~7e16 bytes, the Java client does not.
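The 32-bit limit can be seen directly in Java, independent of Rserve: array lengths are `int` values, and one past `Integer.MAX_VALUE` wraps around to a negative number. A minimal illustration:

```java
public class ArrayLimit {
    public static void main(String[] args) {
        // Array lengths in Java are signed 32-bit ints, so the largest
        // possible array length is Integer.MAX_VALUE = 2^31 - 1.
        System.out.println(Integer.MAX_VALUE);     // prints 2147483647

        // One more element overflows to a negative int, which can never
        // be a valid array length -- hence the hard cap on any single
        // Java array, and on any payload the client buffers in one.
        System.out.println(Integer.MAX_VALUE + 1); // prints -2147483648
    }
}
```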
FWIW, the debugging DUMP code in Rserve also uses integers, but the actual payload is correct; the response header is:
Header: 01 00 01 00 b0 bf ed 9c 00 00 00 00 00 00 00 00
which correctly reports a payload of 2632826800 bytes.
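For reference, that length can be recovered from the header bytes above: in the QAP1 response header the low 32 bits of the payload length sit at byte offset 4 and the high 32 bits at offset 12, both little-endian. A small sketch (assuming that layout) decoding the dump:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class HeaderDecode {
    public static void main(String[] args) {
        // The 16 header bytes from the Rserve(debug=TRUE) dump above.
        byte[] header = {
            0x01, 0x00, 0x01, 0x00,                             // response code
            (byte) 0xb0, (byte) 0xbf, (byte) 0xed, (byte) 0x9c, // length, low 32 bits
            0x00, 0x00, 0x00, 0x00,                             // data offset
            0x00, 0x00, 0x00, 0x00                              // length, high 32 bits
        };
        ByteBuffer buf = ByteBuffer.wrap(header).order(ByteOrder.LITTLE_ENDIAN);
        long low  = buf.getInt(4)  & 0xFFFFFFFFL;  // mask: treat as unsigned
        long high = buf.getInt(12) & 0xFFFFFFFFL;
        long length = (high << 32) | low;
        System.out.println(length); // prints 2632826800
    }
}
```

The decoded length (2632826800) is well above 2^31-1, which is exactly why the client's `int`-indexed buffers cannot hold the response even though the server sent it correctly.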
Hello @s-u ,
Thanks for developing and maintaining Rserve!
I am running into a problem loading large sparse matrices with Rserve, and I hope you can help resolve it.
The code below reproduces the error on both Linux and macOS.
Generating some data first:
The list consumes about 2.5 GiB of memory, and the exported rds file is about 835 MB.
Running Rserve on RHEL7 (an AWS EC2 instance with 16 vCPUs and 128 GiB memory):
Then, loading the generated data in an Rserve session via Java, using the REngine.jar/Rserve.jar libraries shipped with the Rserve 1.8-7 R package, causes the error:
Here is the tail of the stdout from `Rserve(debug=TRUE)`:

The identical error can be reproduced on macOS. Here is the session info for the macOS test run:
Thanks for your time! It is greatly appreciated!
Best regards, Daniel