munkireport / munkireport-php

A reporting tool for munki
MIT License

Possible enhancement request to user_sessions module #1332

Open jelockwood opened 4 years ago

jelockwood commented 4 years ago

A number of years ago Apple moved away from standard log files to a more database-driven approach (unified logging), which requires special tools to search for log entries. This works, but knowing the correct search criteria is often not obvious.

I can see MunkiReport has a built-in module for listing user_sessions https://github.com/munkireport/user_sessions/

@tuxudo @clburlison @pudquick

Unfortunately, whilst this is mostly what I am looking for, it does not seem to include any failed authentication attempts.

Would it be possible to add a listing of failed login attempts and what they relate to, e.g. screensaver, login, ssh, etc.?

tuxudo commented 4 years ago

That information is contained within the unified logging database. I have never used that in a module because it often takes far too long to return data to be usable in a MunkiReport script. Had the logs still been actual log files, I would have incorporated this into the user_sessions module.

jelockwood commented 4 years ago

@tuxudo I thought you might already be using the unified logging database for the information you already provide. I personally regard the unified logging database as a horrible system. I can see some benefits but overall it seems more pain than gain.

It also seems worse than even you may be aware: normally usernames are not included in the output, and they are obviously needed. See - https://apple.stackexchange.com/questions/322763/how-do-i-see-all-my-failed-login-attempts-macos-high-sierra

If you were to search the unified logging database roughly every 60 minutes during normal MunkiReport cycles and specify a search period also of 60 minutes would this be sufficiently fast to be practical?

tuxudo commented 4 years ago

That may be sufficiently fast, but it will have to be fast enough on even the slowest of systems. If you are able to create a command that is fast enough on my test systems and returns the needed data, I'll look into incorporating it into the module.

jelockwood commented 4 years ago

@tuxudo Ok, the following seemed fast enough for me, although my test machine was not the slowest of Macs. However, I still feel it will be acceptable even on a slower Mac, so it will be worth you comparing and making a decision.

I am, however, still having problems getting it to show the username instead of `<private>`; perhaps you will have more success.

Note: I am running Mojave 10.14.6 and I am aware that a config profile is needed to enable the private data in Catalina as per https://medium.com/@boberito/private-data-in-unified-logging-10-15-9eb2b4be5c40

I am willing and able to push out a suitable config profile.
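
For reference, a sketch of what such a logging payload looks like, based on the linked post. The identifier and UUID values below are placeholders, and Apple has changed whether this payload is honored across macOS releases, so verify it on your target version:

```xml
<!-- Placeholder identifiers; embed this payload in a full .mobileconfig -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.system.logging</string>
    <key>PayloadIdentifier</key>
    <string>com.example.logging.private-data</string>
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000000</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>Enable-Private-Data</key>
    <true/>
</dict>
```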

I used the following command to show alleged failed logins.

```shell
sudo log show --predicate '(eventMessage CONTAINS "Authentication failed")' --style syslog --last 1h
```

It took 2-3 seconds at most.
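A possible variant, sketched below: the predicate could be broadened so the output also shows what each failure relates to (screensaver, login, ssh, as requested above). The process names here are assumptions and vary by macOS version, so verify them on a target machine first:

```shell
#!/bin/sh
# Build the broader predicate string; the process list is an assumption.
build_predicate() {
  printf '%s' '(eventMessage CONTAINS "Authentication failed") AND (process IN {"loginwindow", "sshd", "authorizationhost"})'
}

# On a Mac you would then run (sudo is needed to see other users' entries):
#   sudo log show --last 1h --style syslog --predicate "$(build_predicate)"
```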

tuxudo commented 4 years ago

I'll test that out tomorrow on some of my slower test Macs. The command needs to return data in no more than 5 seconds. Doing it this way also leaves the possibility of holes in the data, which isn't really acceptable for data like this. Depending on how the admin has the module configured, it would only show the last hour's worth of data instead of all the data.

jelockwood commented 4 years ago

Yes, you're right, there could be gaps. To handle that you could consider the following.

  1. On the first run, use the 1h command and save a timestamp
  2. On each subsequent run, specify an interval equivalent to the amount of time since the previous run

If you want to ensure an overlap you could add an additional minute, but you would then need logic to spot any duplicates.

I suggest only doing 1 hour for the first run because you don't currently capture historical data for new clients anyway.
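
The two steps above could be sketched roughly as follows. The stamp-file path is an assumption for illustration, not part of any existing module:

```shell
#!/bin/sh
# Checkpoint file recording when the last successful run happened
# (path is a placeholder).
STAMP="${STAMP:-/usr/local/munkireport/failed_auth.stamp}"

# last_checkpoint prints the saved timestamp, or nothing on a first run.
last_checkpoint() {
  [ -f "$STAMP" ] && cat "$STAMP"
}

# On a Mac the collection step would then be (commented out; needs sudo):
#   START="$(last_checkpoint)"
#   if [ -n "$START" ]; then
#     sudo log show --start "$START" --style syslog \
#       --predicate '(eventMessage CONTAINS "Authentication failed")'
#   else
#     sudo log show --last 1h --style syslog \
#       --predicate '(eventMessage CONTAINS "Authentication failed")'
#   fi
# ...and after a successful submit, record the new checkpoint:
#   date '+%Y-%m-%d %H:%M:%S' > "$STAMP"
```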

tuxudo commented 4 years ago

What if the client doesn't report back to the server on hours 2 and 5? You'd lose those two hours of data. Having the logic to append unique data to the cache file on the client would add to the runtime of the script. With it searching through the logs, it's already getting close to the limit before the script is killed.

jelockwood commented 4 years ago

That would be the point of saving a timestamp of the last successful run: it lets you do a search covering an arbitrary period between 60 minutes and the time of the last successful run, e.g. 2, 3, etc. hours. Since the search command can return data covering whatever time window you need, you would only need to check for overlapping entries from the previous run, and that check could be done in the same loop that examines the results to format them for submission to MunkiReport.
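
The overlap/dedup step could be sketched like this (the file layout is an assumption): "new" holds what the current, one-minute-overlapping window returned, and "seen" holds what the previous run already submitted. Each syslog-style line starts with a timestamp, so exact line comparison is enough to drop the overlap:

```shell
#!/bin/sh
# dedup NEW SEEN: print lines of NEW that were not already in SEEN.
dedup() {
  new="$1"; seen="$2"
  sort "$seen" > "$seen.sorted"
  # comm -23 keeps lines present only in the first (new) input.
  sort "$new" | comm -23 - "$seen.sorted"
  rm -f "$seen.sorted"
}
```

For example, `dedup /tmp/new.txt /tmp/seen.txt` would print only the not-yet-submitted lines.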

joncrain commented 4 years ago

Looks like this hasn't been updated, but @tuxudo did create a new module that includes this. https://github.com/munkireport/users It is already in core as well.