typicode / lowdb

Simple and fast JSON database
MIT License

Question: Does the Database read itself on every new push? #535

Closed. Creative-Difficulty closed this 2 years ago

Creative-Difficulty commented 2 years ago

Without lowdb I have to either add all data to an array and then write that array to a file, or write the data to a file and, when appending new data, read the entire file, append the new data to the parsed contents, then clear the file and rewrite it. I'm continuously writing small amounts of data, and my software should be able to run for multiple weeks/months without memory overflow or leaks while continuously writing relatively small amounts of data: about 15-30 JSON key-value pairs, no arrays or nested data, just simple JSON objects. I have tried to append separators and other things asynchronously, but it doesn't work. TL;DR: Does lowdb have good performance, and does it avoid reading itself before every new write?
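For reference, the read-append-rewrite pattern described above looks roughly like this (a sketch with an assumed file name and payload, not code from this issue):

```js
import { readFile, writeFile } from 'node:fs/promises'

// Naive append: parse the whole file, push one entry, rewrite everything.
async function appendEntry(file, entry) {
  let entries = []
  try {
    entries = JSON.parse(await readFile(file, 'utf8'))
  } catch {
    // First run: the file doesn't exist yet (or is empty), start fresh.
  }
  entries.push(entry)
  await writeFile(file, JSON.stringify(entries))
}

await appendEntry('data.json', { temperature: 21, unit: 'C' })
```

The cost of each append grows with the file, which is exactly the behaviour the question is trying to avoid.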

Creative-Difficulty commented 2 years ago

bump

typicode commented 2 years ago

Hi,

It doesn't. It only reads and writes when .read() and .write() are called. Performance is relative: it's fine for some projects and not enough for others, so it's hard to answer this question. The best approach is to do some testing.
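To spell out that read/write flow, a minimal sketch (the import path matches the lowdb 3 style used later in this thread; in newer lowdb versions JSONFile is imported from 'lowdb/node'):

```js
import { Low, JSONFile } from 'lowdb'

const db = new Low(new JSONFile('db.json'))

await db.read()              // the only point where the file is read
db.data ||= { posts: [] }    // default shape if the file was empty

db.data.posts.push({ title: 'hello' })
await db.write()             // the only point where the file is written
```

Everything between those two calls happens purely in memory.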

Creative-Difficulty commented 2 years ago

So it doesn't read when you call .write()?

typicode commented 2 years ago

No, just on read()

Creative-Difficulty commented 2 years ago

> No, just on read()

I looked into it. I have a function that writes data to the DB every two seconds (in production it will run on a Windows server and write roughly every 30 seconds). What gets written to the DB is an array containing all previously written data, and for me this data can get very large (up to 10 GB), so that's not good. My code (very simplified and recreated, ignore possible typos):

// other imports...
import { dirname, join } from 'node:path'
import { fileURLToPath } from 'node:url'
import { Low, JSONFile } from 'lowdb'

const __dirname = dirname(fileURLToPath(import.meta.url))
const file = join(__dirname, 'JSONStorage.json')
const adapter = new JSONFile(file)
const db = new Low(adapter)

await db.read()          // load whatever is already in the file
db.data ||= []
const dbData = db.data

setInterval(fetchJSON, 2000)

async function fetchJSON() {
    const response = await fetch("url goes here")
    const receivedJSON = await response.json()
    dbData.push(receivedJSON)    // push is synchronous, no await needed
    await db.write()             // rewrites the whole file, including all previous entries
}

Setting dbData to [] after every write breaks the module... I know I could use MySQL or Postgres, but I want to keep it simple and stick with lowdb if possible. What should I do?
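One likely reason the reset breaks things: `var dbData = db.data` only copies a reference, so `dbData = []` rebinds the local variable and later pushes land in an array lowdb never sees. If the intent is to drop already-written entries, the reassignment has to happen on `db.data` itself. A sketch (note that JSONFile rewrites the whole file, so with this pattern the file only ever holds the latest batch):

```js
async function fetchJSON() {
  const response = await fetch("url goes here")
  db.data.push(await response.json())   // work on db.data directly, not a stale copy
  await db.write()
  db.data = []                          // start the next batch empty (assumes old entries may be discarded)
}
```

If older entries must be kept somewhere, this only works combined with writing each batch to a different file, along the lines of the suggestion in the next comment.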

typicode commented 2 years ago

I'd avoid storing 10GB in a single JSON file. You can split it into multiple JSON files (100MB each); it could improve things a bit, but if you write a lot you'll probably still hit bottlenecks.

In the end, you'll probably have to come up with optimizations that make sense for your particular case.

Personally, I'd go with MySQL or Postgres for something this large.
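A sketch of that splitting idea: keep appending to the current lowdb file and rotate to a fresh one once it passes a size threshold (the file names and the 100 MB cutoff here are assumptions, not anything built into lowdb):

```js
import { stat } from 'node:fs/promises'
import { Low, JSONFile } from 'lowdb'

const MAX_BYTES = 100 * 1024 * 1024   // rotate after roughly 100 MB
let index = 0
let db = await openDb(index)

async function openDb(i) {
  const database = new Low(new JSONFile(`storage-${i}.json`))
  await database.read()
  database.data ||= []
  return database
}

async function append(entry) {
  db.data.push(entry)
  await db.write()
  const { size } = await stat(`storage-${index}.json`)
  if (size > MAX_BYTES) {
    index += 1                 // current file is full, move on to a fresh one
    db = await openDb(index)
  }
}
```

Each write() still rewrites the whole current file, so this caps the per-write cost at roughly the rotation size instead of letting it grow to 10 GB, but it doesn't remove the rewrite itself.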

Creative-Difficulty commented 2 years ago

> I'd avoid storing 10GB in a single JSON file. You can split it into multiple JSON files (100MB each); it could improve things a bit, but if you write a lot you'll probably still hit bottlenecks.
>
> In the end, you'll probably have to come up with optimizations that make sense for your particular case.
>
> Personally, I'd go with MySQL or Postgres for something this large.

I just kinda like wrote my own :)