kostya / eye

Process monitoring tool. Inspired by Bluepill and God.
MIT License

Duplicate logs with semantic_logger #231

Closed: rgaufman closed this issue 4 years ago

rgaufman commented 4 years ago

I am using semantic_logger in my .eye file like this:

SemanticLogger.default_level = :info
Eye::Logger.log_level = Logger::INFO
SemanticLogger.add_signal_handler
SemanticLogger.add_appender(file_name: File.join(__dir__, 'log/eye.log'))
LOGGER = SemanticLogger['eye']

Eye.config do
  logger LOGGER
end

cameras = fetch_cameras
LOGGER.warn "Starting eye with cameras: #{cameras}"
cameras.each do |camera|
  Eye.application "recorder_#{camera[:camera_id]}" do
    working_dir PATHS[:tmp]
    ...

However, every time I run eye load recorder.eye, I see duplicate log messages. Any ideas what I'm doing wrong?

kostya commented 4 years ago

I have no idea how semantic_logger works. Does it work if you use Logger.new?
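
For reference, a minimal sketch of that suggestion, reusing the log path from the snippet above (plain_logger is just an illustrative name, not from the thread):

require 'logger'

# Plain stdlib Logger writing to a file. Re-evaluating the .eye file only
# creates another Logger object; nothing accumulates in a global appender list.
plain_logger = Logger.new(File.join(__dir__, 'log/eye.log'))
plain_logger.level = Logger::INFO

Eye.config do
  logger plain_logger
end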

rgaufman commented 4 years ago

I'll experiment and see if I can figure something out. I now have a new issue where logging just stops after 30 seconds or so on this machine, even when I remove semantic_logger completely, so something else is happening. I'm trying to figure out what and will reply soon.

rgaufman commented 4 years ago

OK, so every time I run:

bundle exec eye l recorder.eye

I get one extra duplicate of every line: if I run it 3 times, each line appears 3 times; if I run it 10 times, each line appears 10 times.

When switching to Logger.new('/tmp/eye.log'), this does not happen. Hmm.

Do you have any suggestions for something a bit better than Logger.new that doesn't have the duplicate-log issue I'm seeing with semantic_logger?

kostya commented 4 years ago

Maybe this change would fix it:

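# Run the logging setup only once, even if this file gets evaluated again.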
unless defined?(LOGGER)
  SemanticLogger.default_level = :info
  Eye::Logger.log_level = Logger::INFO
  SemanticLogger.add_signal_handler
  SemanticLogger.add_appender(file_name: File.join(__dir__, 'log/eye.log'))
  LOGGER = SemanticLogger['eye']
end
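
The guard matters because eye keeps running between loads and re-evaluates the .eye file on each eye load; without it, the SemanticLogger.add_appender call registers another file appender every time, which is where the extra duplicate per load comes from.
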
rgaufman commented 4 years ago

Yes, that fixed it, thank you!