davidavdav / GaussianMixtures.jl

Large scale Gaussian Mixture Models

Support arbitrary loggers #92

Open Djoop opened 3 years ago

Djoop commented 3 years ago

Hi, in the current version of the package, train.jl contains these lines:

    if Logging.global_logger().min_level ≤ Logging.Debug
        loglevel = :iter
    elseif Logging.global_logger().min_level ≤ Logging.Info
        loglevel = :final
    else
        loglevel = :none
    end

which prevent the module from being used with, e.g., loggers from LoggingExtras (not all of which have a min_level field). Since this loglevel is only used in the call to the Clustering package, wouldn't it be cleaner to simply let Clustering deal with its own logging level? Or is there a standard way in the API to do that? I see in the docs that there is Logging.min_enabled_level, which might help; I'm not sure it's exactly the same, though.
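
For what it's worth, here is a minimal sketch of the latter idea (my own suggestion, not code from the package), using the stdlib's generic Logging.min_enabled_level instead of the SimpleLogger-specific min_level field:

    using Logging

    # Query the logger through the AbstractLogger interface rather than
    # reaching into .min_level, which only some logger types have.
    # current_logger() also respects with_logger, unlike global_logger().
    logger = Logging.current_logger()
    if Logging.min_enabled_level(logger) ≤ Logging.Debug
        loglevel = :iter
    elseif Logging.min_enabled_level(logger) ≤ Logging.Info
        loglevel = :final
    else
        loglevel = :none
    end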

ForceBru commented 3 years ago

This also prevents me from silencing the logger. I need to fit thousands of mixtures and don't want any output while fitting the individual mixtures; otherwise there's too much output, and it slows everything down.

I tried to use a logger with a very high level, but it still prints logs from k-means:

    using Logging, GaussianMixtures

    with_logger(SimpleLogger(stdout, Logging.LogLevel(50))) do
        GMM(N_COMPONENTS, data, nIter=1000)  # data, N_COMPONENTS defined earlier
    end

This outputs:

    K-means converged with 11 iterations (objv = 47.97814024236598)

Looks like I can't influence that log level, because the code queries the global logger directly with Logging.global_logger().
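
To illustrate (my understanding of the stdlib, not code from the package): with_logger only installs a task-local logger, which Logging.global_logger() never sees:

    using Logging

    with_logger(SimpleLogger(devnull)) do
        Logging.current_logger()  # the task-local SimpleLogger set above
        Logging.global_logger()   # still the process-wide default logger;
                                  # this is the one train.jl inspects
    end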


I found a way of silencing logging by temporarily changing the global logger:

    using Logging

    # global_logger(new) installs `new` and returns the previous logger.
    prev_logger = global_logger(SimpleLogger(devnull, Logging.LogLevel(50)))
    result = run_many_gmms(GMM, data, N_COMPONENTS)  # my own batch-fitting helper
    global_logger(prev_logger)  # restore the original logger
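
A slightly more defensive variant (a hypothetical helper, not from this thread) restores the previous logger even when fitting throws:

    using Logging, GaussianMixtures

    # Run `f` with the *global* logger temporarily replaced by `logger`,
    # restoring the previous one even if `f` throws.
    function with_global_logger(f, logger)
        prev = global_logger(logger)
        try
            return f()
        finally
            global_logger(prev)
        end
    end

    # Usage: silences both GaussianMixtures and the k-means output.
    gmm = with_global_logger(SimpleLogger(devnull, Logging.LogLevel(50))) do
        GMM(N_COMPONENTS, data, nIter=1000)
    end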
davidavdav commented 2 years ago

I never understand these logging interfaces. If you have a more satisfactory solution, please submit a PR.