igabriel-olint opened 5 years ago
Seconding this
I would appreciate this as well
I concur. It really bothers me that I have to have 20 JSON files for 20 keys, instead of being able to traverse one file with 20 keys within it.
Where is the problem? Just append your keys to an existing file:

let newKeys = ['foo', 'bob'];
storage.get('myJsonFile', function (error, settings) {
  if (error) throw error;
  // Add the new keys to the array already stored in the file.
  settings.keys.push(...newKeys);
  storage.set('myJsonFile', settings, function (error) {
    if (error) throw error;
  });
});
If you want to append to a file whose data is an array of objects, [ {}, {}, {} ], then for now what we can do is:

storage.get('yourfile', (error, data) => {
  if (error) throw error;
  // Push the new record and write the whole array back.
  data.push(newData);
  storage.set('yourfile', data, (err) => {
    if (err) throw err;
  });
});

If the file is fresh, write an empty array [] to it first to avoid errors.
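The fresh-file case can also be handled in one place by normalizing whatever the storage layer hands back before pushing. A minimal sketch in plain Node.js, independent of the library; `toArray` and `appendRecord` are hypothetical helper names, and it assumes the library returns an empty object `{}` for a missing key:

```javascript
// Treat anything that is not already an array (e.g. {} or undefined
// from a missing key) as a fresh, empty list.
function toArray(data) {
  return Array.isArray(data) ? data : [];
}

// Append one record and return the list, ready to be written back.
function appendRecord(data, record) {
  const list = toArray(data);
  list.push(record);
  return list;
}
```

With this, the callback body becomes `storage.set('yourfile', appendRecord(data, newData), ...)` and no separate initialization step is needed.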
@Serjeel-Ranjan-911 Sounds interesting and simple enough. Perhaps you would be interested in sending a PR yourself, or at least an initial draft?
@jviotti This is not an efficient method; it's only a temporary workaround to make things work. It would perform badly on very large files, since the whole file is parsed and rewritten on every append.
@Serjeel-Ranjan-911 I agree, but sadly you can't do anything more with JSON with the parsers available in Node.js. Most JSON parsers require the entire document to be parsed before any modification, and then written back again. If your use case involves appending a lot of data to a list, I'd recommend using something other than JSON.
E.g. storage.append(keyname, data, callback)
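A helper with that signature could be layered on top of the existing get/set pair. A sketch of what it might look like, shown against a minimal in-memory stand-in for the storage API so it's self-contained; note it would still be read-modify-write under the hood, so the cost per call still grows with file size:

```javascript
// Minimal in-memory stand-in for the get/set half of the storage API,
// only here so the sketch below runs on its own.
const store = {};
const storage = {
  get: (key, cb) => cb(null, store[key] !== undefined ? store[key] : {}),
  set: (key, value, cb) => { store[key] = value; cb(null); },
};

// Sketch of the proposed storage.append(keyname, data, callback):
// fetch the current value, coerce it to an array, push, write back.
storage.append = function (key, data, callback) {
  storage.get(key, (error, current) => {
    if (error) return callback(error);
    const list = Array.isArray(current) ? current : [];
    list.push(data);
    storage.set(key, list, callback);
  });
};
```

Usage would be storage.append('log', { event: 'start' }, (err) => { if (err) throw err; }).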