JRGranell opened this issue 4 years ago
Did you find any solution?
I had this issue too and, after some troubleshooting, gave up. Instead, I was able to convert my large Parquet file to JSON using this Rust project: https://github.com/jupiter/parquet2json
I also had this issue when using `repeated: true` with a large amount of data. The issue is inside the `rle` and `reader` modules. Changing the code to use a safer array copy fixed it:
```javascript
exports.arrayCopy = function (dest, src) {
  const len = src.length;
  for (let i = 0; i < len; i++) {
    dest.push(src[i]);
  }
};
```
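For context on why a plain loop helps: copying via `Array.prototype.push.apply` (or spread) turns every element into a function-call argument, and V8 has a limit on argument count, so a large enough source array throws a `RangeError`. The sketch below contrasts the two approaches; the array size and console output are illustrative, not from parquetjs itself.

```javascript
// The safe copy from the workaround above: one push per element,
// so stack depth stays constant regardless of array size.
function arrayCopy(dest, src) {
  const len = src.length;
  for (let i = 0; i < len; i++) {
    dest.push(src[i]);
  }
}

const src = Array.from({ length: 1_000_000 }, (_, i) => i);

const dest = [];
arrayCopy(dest, src);
console.log(dest.length); // 1000000

// Unsafe for large inputs: every element becomes a call argument,
// which can exceed the engine's argument limit and throw RangeError.
try {
  const dest2 = [];
  dest2.push(...src);
  console.log('spread copy succeeded (array was small enough for this engine)');
} catch (e) {
  console.log('spread copy failed with', e.constructor.name);
}
```

The exact threshold at which the spread version fails is engine-dependent, which is why the bug only shows up on large files.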
I'm having this issue while trying to read a 1.7 MB file. @jgold21, can you say a little more about how you fixed it? I can't see how to use your code in rle.js, but that's probably a problem with my comprehension rather than your JavaScript :-D
I'm having the same issue as well, with a 13049-row file, reading only one of the columns. The workaround by @jgold21 doesn't seem to apply; there is no such function in the codebase anymore.
Hi, I'm trying to read a local file, approximately 1.8 GB with 18790733 rows, SNAPPY compression. When I execute the following code in Node 12, it prints the row count but throws this error on

`cursor.next()`
Would the file size or row count be too large for this to be processed? Alternatively, is there a way to stream the file and read/decode one row at a time?
Thanks in advance,
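For reference, parquetjs's cursor API already decodes one record per `next()` call, which is the closest thing to streaming the library offers. A minimal sketch of that pattern is below; `'data.parquet'` is a placeholder path, `readAllRows` is a hypothetical helper name, and note that the reader still materializes a whole row group internally, so very large row groups may still hit the copy issue discussed above.

```javascript
// Sketch of row-at-a-time reading with parquetjs (assumed installed via npm).
async function readAllRows(path) {
  // require() is deferred so this file loads even without the dependency.
  const parquet = require('parquetjs');

  const reader = await parquet.ParquetReader.openFile(path);
  try {
    // Optionally pass a column list, e.g. reader.getCursor(['someColumn']),
    // to decode only the fields you need.
    const cursor = reader.getCursor();
    let row;
    let count = 0;
    while ((row = await cursor.next())) {
      count += 1; // process `row` here instead of buffering all rows
    }
    return count;
  } finally {
    await reader.close();
  }
}
```

Whether this avoids the crash depends on where the error occurs: if it's the RLE array copy inside a single row group, iterating rows won't help and the fix belongs in the copy code itself.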