Closed suenot closed 4 years ago
Bulk data loading section should help
I wrote a simple example, but I'm not sure this solution is the best:
const options = <OPTIONS>
const ClickHouse = require('@apla/clickhouse')
const ch = new ClickHouse(options)

var counter = 0
const sleep = (ms) => {
  return new Promise(resolve => setTimeout(resolve, ms))
}

const main = async function() {
  while (true) {
    const writableStream = ch.query(`INSERT INTO tutorial.some`, (err) => {
      // console.log(err)
      console.log(`Insert complete, ${counter}! ${new Date()}`)
    })
    for (var i = 0; i < 500000; i++) {
      await writableStream.write(['2000-01-07', counter, 'some'])
      counter += 1
    }
    await writableStream.end()
    await sleep(1) // doesn't work, I don't know why
  }
}
main()
Is it right to use a for loop to add many JavaScript objects?
const stream = ch.query('INSERT INTO table', { format: 'JSONEachRow' })
const someArray = [{ name: 'Foo' }, { name: 'Bar' }]
for (var i = 0; i < someArray.length; i++) {
  stream.write(someArray[i])
}
stream.end()
It's OK to use synchronous loops for a small amount of data, so the second example looks like exactly what you need. (You may also find the for...of loop useful.)
You don't need to worry about the sync or async nature of your stream.write, because it's better to have an async data source and pipe it into the writable stream. (Example).
PS. Don't forget to specify the ch.query callback, or handle the end and error stream events.
PS2. You're passing 1 to the sleep fn when you need 1000.
Thanks, @nezed. You really helped me solve my problem.
In the documentation there is only the section "Insert single row of data:".