botpress / nlu

This repo contains all the ML/NLU-related code written by Botpress for the NodeJS environment, including the Botpress Standalone NLU Server.

feat(logs): suggestion for debug logs #51

Closed allardy closed 3 years ago

allardy commented 3 years ago

Hey guys, I took a look at the logging of the application and I have some points I'd like to discuss. The "logFilter" is not a good idea: if you filter out a type of log and an error occurs, it may go unnoticed for a long time and disrupt the investigation of an issue.

These types of logs should always be displayed, whatever the verbosity of the app is:

All other informational logs are considered "debug", so we should definitely stick with the "debug" library like almost every other project. I would suggest removing the "verbose" and "logFilter" properties and sticking with the DEBUG environment variable to handle this. For the CLI, it can use the same format as debug.json on the server.
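To make the suggestion concrete, here is a minimal sketch of what that could look like with the "debug" package; the namespaces ("nlu:training", "nlu:api") are placeholders, not the actual scopes:

```ts
// Minimal sketch (not the actual Botpress code): scoped loggers built on the
// "debug" package, enabled through the DEBUG environment variable.
import Debug from 'debug'

const trainingDbg = Debug('nlu:training')
const apiDbg = Debug('nlu:api')

trainingDbg('starting training for language %s', 'en')
apiDbg('received a predict request')

// From the shell:
//   DEBUG=nlu:*        -> all NLU debug scopes
//   DEBUG=nlu:training -> only training logs
```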

If we go that way, we would have all nlu debug scopes available on the admin panel, we could toggle them easily, and they would be automatically restored when the server is restarted:

[screenshot: NLU debug scopes toggled from the admin panel]
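The "debug" package lets you change the enabled namespaces at runtime with enable(), so the toggle could be wired roughly like this; persisting the selection to a debug.json file so it survives a restart is an assumption on my part, not the actual implementation:

```ts
// Rough sketch, assuming a hypothetical debug.json file for persistence.
import Debug from 'debug'
import * as fs from 'fs'

const DEBUG_CONFIG_PATH = './debug.json' // hypothetical persistence location

// Called when a scope is toggled from the admin panel
export function setDebugScopes(scopes: string): void {
  Debug.enable(scopes) // e.g. "nlu:training,nlu:api" or "nlu:*"
  fs.writeFileSync(DEBUG_CONFIG_PATH, JSON.stringify({ debug: scopes }))
}

// Called once at startup to restore the last selection
export function restoreDebugScopes(): void {
  if (fs.existsSync(DEBUG_CONFIG_PATH)) {
    const { debug } = JSON.parse(fs.readFileSync(DEBUG_CONFIG_PATH, 'utf8'))
    Debug.enable(debug)
  }
}
```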

The result would be similar to what we have on BP, but it would be clear that it's related to the NLU:

[screenshot: NLU debug output, similar to the Botpress server logs]

Here's the implementation on the server side: https://github.com/botpress/botpress/pull/5084

I've only touched the logging in the nlu package; it could also be changed in the lang server.

franklevasseur commented 3 years ago

Hi @allardy

Thanks for the PR, those are valid points and concerns.

I agree that info, warnings and errors should always be displayed.

I also agree that it is nice to be able to "toggle" a debug scope at runtime. A user whose server has been running for hours might want to get a little more info.

I'll take a look at your PR in more detail and make sure it is available in the next NLU release.

franklevasseur commented 3 years ago

Closing as this is not my current focus.

Those are indeed valid points and concerns, but I'm not sure yet about your solution. We've already discussed a few of my objections, but at the risk of repeating myself:

Let's come back to this if we really believe it's a pain point. It's not for now.