DonutEspresso / big-json

A stream based implementation of JSON.parse and JSON.stringify for big POJOs
MIT License

getting no data #35

Closed hp8wvvvgnj6asjm7 closed 1 year ago

hp8wvvvgnj6asjm7 commented 1 year ago

I tried loading a JSON file that is 20 GB. Nothing is being logged.

const fs = require('fs')
const json = require('big-json')

const readStream = fs.createReadStream('./partner_feed_de.json')
const parseStream = json.createParseStream()

console.log('before piping') // this logs
parseStream.on('data', function (pojo) {
    // => should receive the reconstructed POJO, but never fires
    console.log('data event fired')
    console.log(pojo)
})
parseStream.on('finish', () => {
    // never fires either
    console.log('finish event fired')
})

readStream.pipe(parseStream)

It works with smaller JSON files, but 20 GB seems to be too much?