Closed · maku · closed 1 month ago
Thanks for opening this issue @maku,

`json` handling is something I have been a bit cautious of getting into, since I've been wanting the scope of `tailspin` to be just highlighting of regular log files. To understand the use case better, what are you looking for `tailspin` to do with `json` files that other tools (like `jq`) don't provide today?
@bensadeh I have an application that generates log data in JSON format (a rolling file where the current day's data is logged into a separate file, which is then archived into a folder every day). I'm looking for a simple app that can handle this, offer some kind of aggregation, present the JSON data, and possibly also support searching.
The format of the entries in the log file is:
{"requestId":"e2770848-0859-4e8d-8348-8ca7717fc5d9","uuid":"14ad72e3-16de-4c0c-ad72-e316de5c0c14","timestamp":"2024-04-08 10:27:31,102","message":"xxxxx","level":"ERROR","thread":"tomcat-handler-384","exception":"xxxx"}
(the file as a whole is not valid JSON, because the records are not wrapped in an array)
Thanks for explaining your use case in more detail.

This use case actually plays well into `tailspin`'s architecture, because it works on one line at a time. In other words, we can parse each of these lines as valid `json`. However, the challenge (for me) is how to handle it in a maintainable way (and without going into formatting territory). After all, I want `tailspin` to remain a highlighter. Maybe a middle ground is still useful.

For example, I've been toying with the idea of "flattening" simple `json` objects without any nested objects, like in the example you provided, by converting them into key/value pairs separated by equals signs. For your example line, the result would be something like this:

requestId=e2770848-0859-4e8d-8348-8ca7717fc5d9 uuid=14ad72e3-16de-4c0c-ad72-e316de5c0c14 timestamp=2024-04-08 10:27:31,102 message=xxxxx level=ERROR thread=tomcat-handler-384 exception=xxxx

This would make the `json` entries more readable, and you could still search for entries and do anything else `less` supports.
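To make the flattening idea concrete, here is a minimal, hypothetical sketch (not code from tailspin) of what such a transformation could look like, assuming the serde_json crate and one flat JSON object per log line:

```rust
// Hypothetical sketch of the "flattening" idea described above; this is not
// code from tailspin. Assumes the serde_json crate and one flat (non-nested)
// JSON object per log line.
use serde_json::Value;

/// Parse a single log line as JSON and render its top-level fields
/// as key=value pairs separated by spaces.
fn flatten_line(line: &str) -> Option<String> {
    let value: Value = serde_json::from_str(line).ok()?;
    let object = value.as_object()?;
    let pairs: Vec<String> = object
        .iter()
        .map(|(key, val)| match val.as_str() {
            // Print string values without their surrounding quotes
            Some(s) => format!("{key}={s}"),
            None => format!("{key}={val}"),
        })
        .collect();
    Some(pairs.join(" "))
}

fn main() {
    let line = r#"{"level":"ERROR","thread":"tomcat-handler-384","message":"xxxxx"}"#;
    if let Some(flat) = flatten_line(line) {
        // With serde_json's default (sorted) map this prints:
        // level=ERROR message=xxxxx thread=tomcat-handler-384
        println!("{flat}");
    }
}
```

Lines that fail to parse as JSON could simply be passed through unchanged, which would keep the line-oriented model intact.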
(Side note: if you need aggregation and more advanced features for your log files, you might find tools like `lnav` more helpful)
This really looks great. With some structured log libs out there, syslog messages may also contain JSON.
Sep 14 22:09:32 myhost my-http[20662]: {"level":"debug","request_id":"FFGAdAivbdzqCEMiJEDgMbuxKiZixmSz","id":19,"url":"http://example.com:9082","system":"dev","type":"portal","time":"2024-09-14 22:09:32.051473","message":"Picked upstream"}
Sep 14 22:09:32 myhost my-http[20662]: {"level":"debug","request_id":"FFGAdAivbdzqCEMiJEDgMbuxKiZixmSz","id":19,"system":"dev","type":"portal","url":"http://example.com:9082","time":"2024-09-14 22:09:32.051481","message":"UpstreamTypeBalancer: Found upstream"}
If tailspin were able to reformat that, it would be great.

The example above would need some kind of formatting though (special handling for `level` and `message`):
Sep 14 22:09:32 myhost my-http[20662]: DEBUG UpstreamTypeBalancer: Found upstream level=debug request_id=FFGAdAivbdzqCEMiJEDgMbuxKiZixmSz id=19 system=dev type=portal url=http://example.com:9082 time=2024-09-14 22:09:32.051481
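Purely as an illustration of the requested output (a hypothetical sketch, not something tailspin implements), the transformation could look roughly like this, again assuming serde_json:

```rust
// Hypothetical sketch of the reformatting requested above; tailspin does not
// do this. It splits a syslog line into its prefix and trailing JSON payload,
// promotes the "level" and "message" fields, and appends the remaining fields
// as key=value pairs. Assumes serde_json and a flat JSON object.
use serde_json::Value;

fn reformat(line: &str) -> Option<String> {
    // Everything before the first '{' is treated as the syslog prefix.
    let json_start = line.find('{')?;
    let (prefix, payload) = line.split_at(json_start);
    let value: Value = serde_json::from_str(payload.trim()).ok()?;
    let object = value.as_object()?;

    let level = object
        .get("level")
        .and_then(Value::as_str)
        .unwrap_or("")
        .to_uppercase();
    let message = object.get("message").and_then(Value::as_str).unwrap_or("");

    // Keep the remaining fields (including "level", as in the example above)
    // as key=value pairs; only "message" is dropped from the tail.
    let rest: Vec<String> = object
        .iter()
        .filter(|(key, _)| key.as_str() != "message")
        .map(|(key, val)| match val.as_str() {
            Some(s) => format!("{key}={s}"),
            None => format!("{key}={val}"),
        })
        .collect();

    Some(format!("{prefix}{level} {message} {}", rest.join(" ")))
}

fn main() {
    let line = r#"Sep 14 22:09:32 myhost my-http[20662]: {"level":"debug","id":19,"message":"Found upstream"}"#;
    // Prints: Sep 14 22:09:32 myhost my-http[20662]: DEBUG Found upstream id=19 level=debug
    println!("{}", reformat(line).unwrap());
}
```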
I decided to move away from formatting altogether.
From 4.0.0, `tailspin` will recognize `JSON`, which should help with the formatting and highlighting in general.
It would be great if tailspin were able to handle log files in JSON format.