pinojs / pino

🌲 super fast, all natural json logger
http://getpino.io
MIT License

pretty logs with docker #178

Closed JamesKyburz closed 7 years ago

JamesKyburz commented 7 years ago

When running docker-compose logs -f | pino -l, nothing gets formatted.

I got it working when I stripped everything prior to first "{"

Before sending a PR to fix this I was after some tips, so I don't break other formatters :)

jsumners commented 7 years ago

I have no idea what the output of docker-compose logs -f looks like so I can't say if Pino would even parse them. Please provide some sample output.

davidmarkclements commented 7 years ago

yes - can we see some sample output

Are you saying that you're logging with pino in processes, wrapped in containers, and then using docker-compose to amalgamate the logs from all of those containers, and then prettify that output?

JamesKyburz commented 7 years ago

@davidmarkclements exactly

Here is the raw docker-compose log

> docker-compose logs -f --tail 100 fontello
Attaching to dpd2_fontello_1
fontello_1         | > node src/index
fontello_1         |
fontello_1         | {"pid":19,"hostname":"4de4202901f3","name":"fontello-server:httpserver","level":30,"time":1484855262728,"msg":"running on http://localhost:2016","v":1}

Here is the output when the logger uses pino.pretty()

> docker-compose logs -f --tail 100 fontello
fontello_1         | > node src/index
fontello_1         |
fontello_1         | [2017-01-19T21:57:16.070Z] INFO (fontello-server:httpserver/18 on 4de4202901f3): running on http://localhost:2016

And here is when I pipe docker-compose to pino, as you can see the output is not formatted.

> docker-compose logs -f --tail 100 fontello | pino -l
Attaching to dpd2_fontello_1
fontello_1         | > node src/index
fontello_1         |
fontello_1         | {"pid":19,"hostname":"4de4202901f3","name":"fontello-server:httpserver","level":30,"time":1484855262728,"msg":"running on http://localhost:2016","v":1}

So really it's only the prefix that's the issue; it's what makes the line invalid JSON in this case.
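A minimal sketch of that stripping step, using sed. The sample line is taken from the docker-compose output above; pino itself is not invoked here, so the snippet stays self-contained:

```shell
# A docker-compose log line: container-name prefix, then the JSON record.
line='fontello_1         | {"pid":19,"level":30,"msg":"running on http://localhost:2016"}'

# Delete everything up to and including the first "{" (re-emitting the "{").
# -n plus the /p flag prints only lines that matched, silently dropping
# non-JSON lines such as "Attaching to dpd2_fontello_1".
echo "$line" | sed -n 's/^[^{]*{/{/p'
# → {"pid":19,"level":30,"msg":"running on http://localhost:2016"}
```

In the real pipeline this would sit between docker-compose and the prettifier, e.g. docker-compose logs -f | sed -un 's/^[^{]*{/{/p' | pino -l (GNU sed's -u disables output buffering so the -f stream keeps flowing).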

jsumners commented 7 years ago

Personally, I think that's a use case outside the scope of the prettifier. The prettifier is meant to parse newline-delimited JSON. This input is not that.

JamesKyburz commented 7 years ago

@jsumners Yeah adding something specific to docker for pino does feel wrong.

Maybe I'll publish another module that strips docker logs prior to piping to pino.

Anyways, thanks and closing issue.

davidmarkclements commented 7 years ago

I agree - I think that's the best approach @JamesKyburz (unless there's a way to modify the logging with docker-compose). Another option would be to see if there's an alternative to docker-compose for aggregating docker logs (@mcollina actually wrote something along those lines, I think, though before the APIs changed).

In any event, what would be really nice, once you've written it, is to add it to the docs. This doesn't just apply to prettifying; it works with transports as well, so I think it's worth documenting.

Also - we can help with the perf side of things once it's built, if you like.

sentientmachine commented 5 years ago

pino-pretty plays nicely with docker container log prettifying:

# Produce a dump of container names plus JSON. This output is specific to docker:
docker-compose logs --tail 1 your_container

# Cut off the unnecessary prefix to yield a pure JSON string:
docker-compose logs --tail 1 your_container | cut -d"|" -f2

# Inside that JSON string is the key and value to prettify; pipe it to pino-pretty:
docker-compose logs --tail 1 your_container | cut -d"|" -f2 | pino-pretty -m yourjsonkey

pino-pretty parses the docker container's JSON, extracts the key you request, formats \n to newlines, colors the output according to its default highlighting scheme, and prints to stdout. Its appearance is somewhat like Python's default formatting of syntax errors and runtime exceptions.
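One caveat worth noting with the cut-based pipeline (my observation, not from the thread): -f2 keeps only the second pipe-delimited field, so a log message that itself contains a | character gets truncated. An open-ended field range, -f2-, keeps everything after the first delimiter instead:

```shell
# A log line whose JSON message happens to contain a pipe character.
line='your_container_1 | {"msg":"a | b"}'

# -f2 stops at the next delimiter, truncating the JSON:
echo "$line" | cut -d'|' -f2
# → " {"msg":"a "

# -f2- keeps fields 2 onward, rejoined with the delimiter, so the JSON survives:
echo "$line" | cut -d'|' -f2-
# → " {"msg":"a | b"}"
```

(The leading space from the docker prefix remains in both cases; pino-pretty's JSON parsing is not bothered by it.)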

github-actions[bot] commented 2 years ago

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.