KxSystems / rkdb

R client for kdb+
https://code.kx.com/q/interfaces
Apache License 2.0
41 stars 29 forks

Segfault when transferring objects >2gb #16

Closed gerrymanoim closed 6 years ago

gerrymanoim commented 7 years ago

Strange issue: rkdb seems to segfault with a "memory not mapped" error (bringing down the entire R process) when transferring objects larger than about 2 GB. This is on a 64-bit Linux machine with the professional version of kdb+/q.

statquant commented 7 years ago

Can you provide a reproducible example?

gerrymanoim commented 7 years ago

In q, with a port set:

```q
t:til 312500000
```

In R:

```r
library(rkdb)

# set URL and port variables as appropriate
qcon <- rkdb::open_connection(url, port)
out <- rkdb::execute(qcon, "t")
```

Crashes my R session with the error:

```
caught segfault
address 0x263a000, cause 'memory not mapped'
Segmentation fault
```


gerrymanoim commented 7 years ago

Is this a reproducible issue for anyone else or just specific to my particular instance?

sv commented 7 years ago

This is a known issue. Currently, none of the client libraries support message sizes >2 GB; this includes c.o, which the R interface uses. You would need to split your data and fetch it in chunks. It doesn't crash for me:

```r
> qcon <- rkdb::open_connection()
> out <- rkdb::execute(qcon, "til 312500000")
Error in rkdb::execute(qcon, "til 312500000") : Error from kdb+: `limit
```

What version of kdb+ are you using and what version of rkdb?
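For reference, 312,500,000 longs at 8 bytes each is 2.5 GB, just over the 2 GB IPC message limit, which is why this particular repro trips it. A minimal sketch of the chunked approach suggested above, assuming a live kdb+ process serving the list `t` on a hypothetical host/port (q's `sublist` takes a (start;length) pair):

```r
library(rkdb)

# hypothetical connection details
qcon <- rkdb::open_connection("localhost", 5000)

# fetch `t` in pieces small enough to stay well under the 2 GB message limit
n     <- rkdb::execute(qcon, "count t")
chunk <- 50000000                 # 50M longs ~ 400 MB per message
parts <- lapply(seq(0, n - 1, by = chunk), function(i)
  rkdb::execute(qcon, sprintf("(%d %d) sublist t", i, chunk)))
out   <- do.call(c, parts)        # reassemble on the R side

rkdb::close_connection(qcon)
```

This is only a sketch: it needs a running kdb+ server, and for tables (rather than vectors) you would concatenate the parts with `rbind` instead of `c`.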

gerrymanoim commented 7 years ago

Sorry, should have mentioned that earlier.

```q
.z.K  / 3.5
.z.k  / 2017.04.06
.z.o  / `l64
```


sv commented 7 years ago

I cannot reproduce this. Are there any specific command-line params on your kdb+ process? Any errors on the kdb+ side before R crashes? Could you install the latest version of rkdb from GitHub and see if it still happens?

gerrymanoim commented 7 years ago

Interesting, thanks for trying it out.

Nothing special besides rlwrap. If I use q 3.2 I get that same `limit` error; it's only on 3.5 that this segfaults.

I will try the latest version.

sv commented 6 years ago

Did it work for you with 3.5?

gerrymanoim commented 6 years ago

Still segfaults, unfortunately. Must be something with my particular environment. Appreciate the help.