Open Djoop opened 3 years ago
This also doesn't let me silence the logger. I need to fit thousands of mixtures and don't want any output while fitting individual mixtures - otherwise there's too much output, and it also slows everything down.
I tried to use a logger with a very high level, but it still prints logs from k-means:
```julia
with_logger(SimpleLogger(stdout, Logging.LogLevel(50))) do
    GMM(N_COMPONENTS, data, nIter=1000)
end
```
This outputs:
```
K-means converged with 11 iterations (objv = 47.97814024236598)
```
Looks like I can't influence that log level, because the code queries the global logger directly with `Logging.global_logger()`.
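To illustrate: `with_logger` only swaps the task-local logger, while `Logging.global_logger()` keeps returning the process-wide one, so any code that queries the global logger directly is never silenced (a minimal sketch):

```julia
using Logging

# `with_logger` installs `silent` as the *task-local* logger, but
# `Logging.global_logger()` still returns the process-wide logger.
silent = SimpleLogger(devnull, Logging.LogLevel(50))
seen = with_logger(silent) do
    Logging.global_logger()   # what a global-logger query effectively reads
end
@assert seen === Logging.global_logger()  # still the global logger
@assert seen !== silent                   # the silent logger was bypassed
```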
I found a way of silencing logging by temporarily changing the global logger:
```julia
prev_logger = global_logger(SimpleLogger(devnull, Logging.LogLevel(50)))
result = run_many_gmms(GMM, data, N_COMPONENTS)
global_logger(prev_logger)
```
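For what it's worth, the swap is safer inside `try`/`finally`, so the previous global logger is restored even if a fit throws (a sketch; the fit call itself is kept as a comment since `run_many_gmms`, `data`, and `N_COMPONENTS` come from the snippet above):

```julia
using Logging

# Temporarily route all global logging to devnull, restoring the previous
# logger even if the wrapped code errors out.
prev_logger = global_logger(SimpleLogger(devnull, Logging.LogLevel(50)))
try
    # result = run_many_gmms(GMM, data, N_COMPONENTS)  # the actual fitting
    @info "this message is swallowed by the devnull logger"
finally
    global_logger(prev_logger)
end
```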
I never understand these logging interfaces. If you have a more satisfactory solution, please submit a PR.
Hi, in the current version of the package, train.jl contains a few lines which prevent using the module with e.g. loggers from LoggingExtras (which do not all have a `min_level` attribute). Since that log level seems to be used only for the call into the Clustering package, wouldn't it be cleaner to simply let Clustering deal with its own logging level? Or is there a standard way in the API to do that? I see in the docs that there is a `min_enabled_level`, which might help; not sure if it's exactly the same though.
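If it helps, `Logging.min_enabled_level` does appear to be the generic query in the stdlib `AbstractLogger` API, so it works for loggers that have no `min_level` field (a quick sketch of the difference):

```julia
using Logging

# `Logging.min_enabled_level(logger)` is part of the AbstractLogger API and
# works for any logger; reading a `min_level` field only works for loggers
# that happen to have one (LoggingExtras wrappers generally don't).
lg = SimpleLogger(stderr, Logging.Warn)
@assert Logging.min_enabled_level(lg) == Logging.Warn
# An Info-level record would be filtered out by this logger:
@assert !(Logging.Info >= Logging.min_enabled_level(lg))
```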