pinojs / pino

🌲 super fast, all natural json logger
http://getpino.io
MIT License
14.21k stars 875 forks

Can we work with Mongo? #109

Closed viktor-ku closed 7 years ago

viktor-ku commented 8 years ago

Maybe we could add a transport, or extend pino-socket, to use it with MongoDB? That would be brilliant. Thanks

mcollina commented 8 years ago

I think a pino-mongodb module is in order, you can get inspired by https://github.com/mcollina/pino-elasticsearch.

Send a PR when it's done, we'll add it to the README.

davidmarkclements commented 8 years ago

@ViktorKonsta if you need any assistance making this, just reach out :bowtie:

viktor-ku commented 8 years ago

@davidmarkclements Thank you. I will try it today, I think

viktor-ku commented 7 years ago

Oooops xD Not today, right? A month. But I did something: at least it reads from stdin and inserts documents where we need them. Check this out

Note: Guys, don't kill me over my code xD I wrote this in two hours, so I will keep working on it and add some tests as well.

davidmarkclements commented 7 years ago

awesome!

Oooops xD Not today, right? A month.

literally me all the time

Guys, don't kill me over my code

looked through it; your code is clear and concise, and you've made the right dependency choices (the raw mongo driver instead of mongoose or others)

you will run into one problem quite quickly though - the fact that you're transpiling may make it difficult to improve your perf, and you'll probably find that after you've gone through all the pain of optimizing whilst transpiling, it'll all be es3 or es5 at best (much of es6 is slow when transpiled, and transpilation adds its own overhead too)

viktor-ku commented 7 years ago

@davidmarkclements I will try my best to write this with performance in mind. How can I benchmark the speed?

And I also updated it just a minute ago to write the logs back to stdout, so we can keep piping this to something else.

viktor-ku commented 7 years ago

I rewrote the whole module. Do you want to attach it to the pino group somehow, or what should I do?

davidmarkclements commented 7 years ago

I'd love to have it on the pinojs repo if @mcollina agrees - for consistency it'll need a few tweaks

  1. we use standard across all repos for linting
  2. it'll need benchmarks, see how we do benchmarks (here's an example https://github.com/pinojs/pino-noir/blob/master/benchmarks/index.js)
  3. every description is prefixed with 🌲
  4. it'll need tests - min 90% cov, pref 100%
  5. all tests so far have been written with tap - be good to keep that for consistency
  6. could you add a LICENSE section to the readme (glad you're using MIT!)

one final point: pino-mongodb.js is now redundant - your entry point is index.js, so you can either remove it, or rename index.js to pino-mongodb.js and set that as main in package.json (either is fine by me)
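As an illustration of that last point, the relevant package.json fields might look something like this (a sketch based on the rename suggested above, not the module's actual manifest):

```json
{
  "name": "pino-mongodb",
  "main": "pino-mongodb.js"
}
```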

@mcollina we should probably add a doc that outlines this process using this issue as starting point

davidmarkclements commented 7 years ago

oh one other point - is there any case where a doc returned from mongodb has a circular reference? If not it'll be faster (and just as safe) to use JSON.stringify directly

viktor-ku commented 7 years ago

@davidmarkclements I will definitely do it next week. Although some of the tools are new to me, I will try my best.

And OK, I can return to the plain JSON method, since this module writes everything from stdin back to stdout. As far as I know, pino logs have no circular references.

mcollina commented 7 years ago

Wow!

Some notes:

a) There is no need to call write; we can probably just do process.stdin.pipe(process.stdout).

b) Do not use on('data'), as it will not support backpressure: writing to mongo might be slower than the rate at which your app produces logs. You can use streams! See how it is done in pino-elasticsearch https://github.com/pinojs/pino-elasticsearch/blob/master/pino-elasticsearch.js#L37-L83. (writev is for bulk writes; it increases the ingestion speed dramatically.)

c) 'use strict' at the beginning is missing.

d) Tests are missing :( - they are extremely important, especially to guarantee that the library works across different versions of MongoDB (that has been a problem for me in the past).

mcollina commented 7 years ago

And OK, I can return to the plain JSON method, since this module writes everything from stdin back to stdout. As far as I know, pino logs have no circular references.

Yes, you are guaranteed that there are no circular references here.

viktor-ku commented 7 years ago

Alright. Thank you guys for the advice. I will need some time because of my job; probably by the end of this week I will have something to show.

jsumners commented 7 years ago

And OK, I can return to the plain JSON method, since this module writes everything from stdin back to stdout. As far as I know, pino logs have no circular references.

Yes, you are guaranteed that there are no circular references here.

Well, unless .safe = false. In that case, I don't think you are guaranteed anything.

mcollina commented 7 years ago

Well, unless .safe = false. In that case, I don't think you are guaranteed anything.

that will blow up the source/application, not the transport. After the transport has parsed the JSON, we can be sure that there are no circular references, because otherwise it wouldn't be valid JSON.

davidmarkclements commented 7 years ago

the question was about circ refs in mongo docs - that's what's being stringified

the input stream should use fast-json-parse in case of corrupt input

viktor-ku commented 7 years ago

I will remove any stringify, and fast-safe-stringify in particular, because the input comes from stdin and contains nothing but logs.

The other question is whether logs can potentially be unsafe, and the answer is: we do not know what a user might write into the logs, so yes. Therefore I will parse the input with fast-json-parse
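The fast-json-parse approach mentioned here is essentially a try/catch around JSON.parse that hands back an error object instead of throwing. A hedged sketch of how the transport could guard each incoming line (the function names are illustrative, not from the actual module):

```javascript
'use strict'

// Stand-in for the fast-json-parse pattern: return { err, value }
// instead of throwing on corrupt input.
function tryParse (line) {
  try {
    return { value: JSON.parse(line) }
  } catch (err) {
    return { err }
  }
}

// Turn one stdin line into a document to insert, or null when the
// line is not valid JSON (e.g. a stray non-pino log line).
function toDocument (line) {
  const parsed = tryParse(line)
  return parsed.err ? null : parsed.value
}
```

This way a single corrupt line is skipped instead of crashing the transport mid-stream.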

jsumners commented 7 years ago

What is the resolution plan for this issue? I see that pino-mongodb is published.

mcollina commented 7 years ago

I think we are good to go. We can close this, and maybe link pino-mongodb from the docs?

davidmarkclements commented 7 years ago

great job @ViktorKonsta

jsumners commented 7 years ago

@ViktorKonsta would you like to add your module to https://github.com/pinojs/pino/blob/master/docs/transports.md ?

viktor-ku commented 7 years ago

There are no benchmarks because they really depend on MongoDB writes. And if it's OK with you, I would like to add this module to the transports. How should we handle the npm package, as I have already published it?

davidmarkclements commented 7 years ago

Even at that, benchmarks would still be useful, even for comparing transport tradeoffs.

If you'd be up for including benchmarks + having a peer review once benchmarks are added we could even make it an official transport (part of pinojs org) - with you as lead maintainer.

jsumners commented 7 years ago

Regardless of it being in the org or not, we currently operate by the "lead maintainer" publishing their modules to their npm account. If it's in the org, then I think the plan is to have the core team added as collaborators on npm for such modules (but we haven't really done that yet).

viktor-ku commented 7 years ago

We can add a link to this module, and eventually I will add benchmarks and then open another PR (or reopen this one)?

mcollina commented 7 years ago

@ViktorKonsta go ahead 👍

davidmarkclements commented 7 years ago

closing for now - @ViktorKonsta let us know when ready! :bowtie:

viktor-ku commented 7 years ago

Hello again. I have time to finish this module, and it would be great if we could include it in the pino group. Could you explain what type of benchmarks are required? pino-elasticsearch does not have any. pino-noir is basically a different type of module: because it interacts with the pino interface, we can benchmark it. pino-mongodb does not even have a pino interface, so how do I benchmark it?

mcollina commented 7 years ago

@Kuroljov let's forget about benchmarks atm. I'll add you to the organization, feel free to move your project when ready.

github-actions[bot] commented 2 years ago

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.