zerolaser opened this issue 2 years ago
Hi @zerolaser, I will try and work on this. In the meantime, `pd rest fetch` can fetch audit records and give you the results as tables; see the REST API reference for the endpoints, parameters and limitations. Here is an example that gets one month (the maximum you are allowed to request in a single query) of user records and displays it as a table with some of the more relevant information:

```
pd rest fetch -e audit/records -P since=$(pd util timestamp 1/1/2022) -P until=$(pd util timestamp 2/1/2022) -P 'root_resource_types[]=users' -t -k execution_time -k 'actors[*].summary' -k action -k root_resource.summary -k 'details.references[*].name'
```
Can you let me know more specifically what you'd be looking to see so that I can incorporate it into my work as I explore this?
@martindstone Thanks for your pagerduty-cli. Before this I used pdpyras to loop through paginated results; with your pd CLI I can get all the results at once and then query further based on them. You just made my life simpler.
I currently use your CLI for everything except getting these user create/delete dates, and for that I'm depending on pdpyras. These are the parameters I need to pass: `actions[]`, `since`, `until` and `root_resource_types[]`:
```python
payload = {
    'cursor': next_cursor,
    'actions[]': ["create", "delete"],
    'since': '2022-08-10',
    'until': '2022-09-10',
    'root_resource_types[]': 'users'
}
```
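For context, the audit records endpoint uses cursor-based pagination, so the loop threads each response's `next_cursor` back into the following request until no cursor is returned. A minimal sketch of that loop, using a stub `fetch_page` and made-up sample pages in place of a real API call (the function name and data are illustrative, not part of pdpyras or the pd CLI):

```python
# Sketch of cursor-based pagination: keep requesting until next_cursor is None.
# fetch_page is a stand-in for a real HTTP GET of /audit/records.

PAGES = {
    None: {"records": [{"id": "R1"}, {"id": "R2"}], "next_cursor": "abc"},
    "abc": {"records": [{"id": "R3"}], "next_cursor": None},
}

def fetch_page(cursor):
    """Pretend API call: returns one page of audit records for the given cursor."""
    return PAGES[cursor]

def fetch_all_records():
    records = []
    cursor = None  # first request carries no cursor
    while True:
        page = fetch_page(cursor)
        records.extend(page["records"])
        cursor = page.get("next_cursor")
        if not cursor:  # no more pages
            break
    return records

print([r["id"] for r in fetch_all_records()])  # ['R1', 'R2', 'R3']
```

With pdpyras the same threading is handled for you, so in practice the loop above collapses into a single iterator call over the endpoint.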
I'm looking for a way with pd-cli to get user create/delete dates. I understand that:

Also, thank you for the `pd rest` example above. I'll use that to get the results I need.
Hi @zerolaser, makes sense... take a look at the following simple Python script and let me know if it could be a good starting point:
```python
#!/usr/bin/env python3
import sys
import json
import subprocess
import datetime
import csv
import argparse

from dateutil.relativedelta import relativedelta

parser = argparse.ArgumentParser('Get users created/deleted')
parser.add_argument('-o', '--output_file', help="File to write CSV results", required=True)
args = parser.parse_args()

def runcmd(cmd):
    """Run a shell command and parse its stdout as JSON."""
    r = subprocess.run(cmd, shell=True, capture_output=True)
    return json.loads(r.stdout)

now = datetime.datetime.now()
# The audit records endpoint allows at most one month per query,
# so walk forward one month at a time.
start = datetime.datetime(2021, 8, 1, 0, 0, 0)

records = []
while start < now:
    end = start + relativedelta(months=1)
    print(f"Getting audit records from {start.isoformat()} to {end.isoformat()}...", file=sys.stderr)
    r = runcmd(f"pd rest fetch -e audit/records -P since={start.isoformat()} -P until={end.isoformat()}" +
               " -P 'root_resource_types[]=users' -P 'actions[]=create' -P 'actions[]=delete'")
    records.extend(r)
    start = end

rows = []
for r in records:
    rows.append({
        'action': r['action'],
        'execution_time': r['execution_time'],
        'name': r['root_resource']['summary'],
        'id': r['root_resource']['id'],
    })

# newline='' avoids blank lines in the CSV on Windows; 'with' closes the file.
with open(args.output_file, 'w', newline='') as f:
    writer = csv.DictWriter(f, ['action', 'execution_time', 'name', 'id'])
    writer.writeheader()
    writer.writerows(rows)
```
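Since the goal is a create date and a delete date per user, the per-action rows the script writes can be folded into one row per user id. A hedged sketch of that post-processing step, using made-up sample rows whose keys match the script's CSV columns:

```python
# Fold per-action audit rows into one record per user id, with the create and
# delete timestamps side by side. Sample rows mimic the script's CSV output.

rows = [
    {'action': 'create', 'execution_time': '2022-08-11T09:00:00Z', 'name': 'Ada', 'id': 'PUSER1'},
    {'action': 'delete', 'execution_time': '2022-09-01T17:30:00Z', 'name': 'Ada', 'id': 'PUSER1'},
    {'action': 'create', 'execution_time': '2022-08-20T12:00:00Z', 'name': 'Grace', 'id': 'PUSER2'},
]

users = {}
for row in rows:
    # One entry per user; a user with no delete record keeps deleted=None.
    user = users.setdefault(row['id'], {'name': row['name'], 'created': None, 'deleted': None})
    if row['action'] == 'create':
        user['created'] = row['execution_time']
    elif row['action'] == 'delete':
        user['deleted'] = row['execution_time']

for uid, u in users.items():
    print(uid, u['name'], u['created'], u['deleted'])
```

Reading the rows back from the CSV file with `csv.DictReader` instead of an in-memory list would work the same way.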
Thanks...
The Audit API is required to monitor PagerDuty resources. Key areas of usage are to: