Closed: insomniacc closed this issue 1 year ago
This would be a useful feature.
I've implemented this feature in my own modules in the past in a fairly naïve way. I didn't find it necessary to track the request chain at the point of failure (i.e. the current data, the nextLink for the next query, etc.) and just elected to start the request chain again from the top level.
If you were looking for some inspiration, I've thrown together a quick example (untested) which roughly emulates my approach.
../private/Invoke-HTTPGenericApiQuery.ps1 - a generic method to communicate with the API, which all the public API functions call:
function Invoke-HTTPGenericApiQuery {
    param(
        $Method,
        $Resource,
        $Body,
        $MaxRetries = 5,
        $API_URL # base url for the API
    )
    $PropertySplat = @{
        Headers     = $Header # usually I'd inject headers via a private module/script scoped variable
        Uri         = "$API_URL/$Resource"
        Method      = $Method
        ContentType = "application/json"
    }
    if ($Method -ne "GET" -and $Method -ne "DELETE") {
        $PropertySplat.Add("Body", $Body)
    }
    for ($RetryCount = 0; $RetryCount -lt $MaxRetries; $RetryCount++) {
        try {
            # can likely transplant your existing code here, this is just a naive example
            $Responses = [System.Collections.Generic.List[PSObject]]::new()
            $Responses.Add((Invoke-RestMethod @PropertySplat))
            # implement your nextlink handling / custom filtering / data concatenation logic here
            # no HTTP exception was thrown, we don't need to retry :)
            return $Responses
        }
        catch {
            # Ideally handle all other possible non-HTTP exceptions first
            # As an example, if the internal function throws an error that the user hasn't connected yet,
            # we'll rethrow to the caller
            if ($_ -like "Please connect using*") {
                throw $_ # rethrow the error, or your own custom error type to be handled by the caller
            }
            # The implementation of Invoke-WebRequest and Invoke-RestMethod in PS 5.1 is absolutely awful
            # You will need to consider forking logic here to support multiple platforms
            # A quick and dirty approach would be something like below:
            if ($PSVersionTable.PSVersion -like "5.*") {
                # It would be best to confirm that $_.Exception actually contains a response first :)
                $HTTPResponse = $_.Exception.Response.GetResponseStream()
                $SR = New-Object System.IO.StreamReader($HTTPResponse)
                $SR.BaseStream.Position = 0
                $SR.DiscardBufferedData()
                $ResponseBody = $SR.ReadToEnd()
                if ($_.Exception.Response.StatusCode -eq 404) {
                    throw $_ # rethrow the error, or your own custom error type to be handled by the caller
                }
                if ($_.Exception.Response.StatusCode -eq 429) {
                    $RetryAfter = [int]$_.Exception.Response.Headers["Retry-After"]
                    Write-Warning "HTTP Response 429 - Too many requests, retrying in $RetryAfter seconds (Retry-After HTTP header)"
                    Write-Warning "The HTTP response body was:`n$ResponseBody"
                    Start-Sleep -Seconds $RetryAfter
                    continue # go around the retry loop again
                }
                throw $_ # any other HTTP error, rethrow to the caller
            }
            else {
                # PS 7/non-Windows PS code here to handle HTTP exceptions
                throw $_
            }
        }
    }
    # We met our MaxRetries
    throw "TooManyRetriesError" # can implement your own error types if you like
}
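For completeness, calling it would look something like this (the instance URL, resource path, and header contents below are purely illustrative):

$Header = @{ Authorization = "Bearer <token>" } # however your module builds/injects auth headers
$Result = Invoke-HTTPGenericApiQuery -Method "GET" -Resource "api/now/table/incident?sysparm_limit=10" -API_URL "https://your-instance.service-now.com"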
I had a brief read through the related areas of the module with the thought that I might raise a PR to implement handling these cases.
At a glance the parameter templating and the dynamic parameter passthru stuff is cool and pretty clever. The internals of the module are certainly fancier than I expected, so with that in mind I will propose some feedback. I haven't read the module entirely, so I may be missing some context.
There are numerous areas that need this rate limiting and HTTP error handling behaviour, and as the project grows over time, I think this will become quite a pain.
I would consider building a wrapper function for Invoke-RestMethod/Invoke-WebRequest, implementing the rate limiting there, and then using that function within your base Invoke-SNOWTable* functions, making sure to set a MaxRetries default parameter and value and pass it through the chain of methods. This would give you a single place to maintain all your rate limiting logic and reduce a lot of future mess. You could even use this wrapper method within your BatchProcessing functions, as an example.
From there, I guess you can decide whether you want to pass a MaxRetries param (with a default value) from the calling function in the chain, or set (and allow the user to set) a module-wide MaxRetries setting.
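As a rough sketch of the module-wide option (the function and variable names below are purely illustrative, not existing module members):

$Script:SNOWMaxRetries = 5 # module/script scoped default

function Set-SNOWRetryConfig {
    # Public setter so the user can change the module-wide default once
    param([int]$MaxRetries)
    $Script:SNOWMaxRetries = $MaxRetries
}

function Invoke-SNOWWebRequest {
    # The single wrapper that owns the rate limiting/retry logic for every HTTP call
    param(
        [hashtable]$RestMethodParameters,
        [int]$MaxRetries = $Script:SNOWMaxRetries # per-call override falls back to the module-wide default
    )
    for ($Attempt = 1; $Attempt -le $MaxRetries; $Attempt++) {
        try {
            return Invoke-RestMethod @RestMethodParameters
        }
        catch {
            # inspect the response and sleep on 429 as in the earlier example; rethrow anything non-retryable
            if ($Attempt -eq $MaxRetries) { throw $_ }
        }
    }
}

The base Invoke-SNOWTable* functions would then call Invoke-SNOWWebRequest rather than Invoke-RestMethod directly.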
It might be worth reconsidering the approach to batch processing, but keep in mind I haven't read into this area in great detail. If you implemented a request wrapper, maybe it's worth handling batch processing within the wrapper function? To me it would result in code that flows more naturally, and would share the benefit of the rate limiting protection (and any other shared behaviour all HTTP requests require) without having to repeat much code.
I assume that in future, more and more "helper" functions (such as *-SNOWRITMVariableSet) are likely to be added to the module, and they will require some sort of composition (i.e. creating, fetching or updating data in multiple tables).
It might be a good idea to establish stricter layers in the module:
A "data" layer that just wraps the CRUD functionality of the actual ServiceNow REST API endpoints Essentially, this is any code that directly maps CRUD updates on atomic tables.
A "service" layer that is responsible for any composition/aggregation. Functions created that perform multiple API actions on the platform should strictly achieve this by chaining calls to concrete implementations in the data layer.
Taking *-SNOWRITMVariableSet as an example, these functions directly call *-SNOWObject instead of having a concrete implementation.
It makes sense to default to having a concrete implementation for all tables used in the module. It reduces the need to refactor later for DRY reasons when you inevitably find duplicated cases of unimplemented CRUD actions being performed by calling *-SNOWObject, and it will make it easier to handle API changes by ServiceNow, etc.
Furthermore, with the way templated and dynamic parameters are being passed around, keeping the approach consistent will probably help to keep things simple and predictable when changing things.
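To illustrate the layering idea, a rough sketch (the table names and the Get-SNOWObject parameters below are only illustrative and would need adapting to the module's actual signatures):

# Data layer: one concrete wrapper per table, each a thin shim over the generic CRUD function
function Get-SNOWVariableMTOM {
    param([string]$RequestItemSysID)
    Get-SNOWObject -Table "sc_item_option_mtom" -Query "request_item=$RequestItemSysID"
}
function Get-SNOWVariable {
    param([string]$SysID)
    Get-SNOWObject -Table "sc_item_option" -SysID $SysID
}

# Service layer: composition/aggregation only, achieved by chaining the concrete data-layer functions
function Get-SNOWRITMVariableSet {
    param([string]$RequestItemSysID)
    foreach ($Mapping in (Get-SNOWVariableMTOM -RequestItemSysID $RequestItemSysID)) {
        Get-SNOWVariable -SysID $Mapping.sc_item_option.value
    }
}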
Thanks for the feedback again, it's appreciated! I guess I never implemented rate limit handling in this module because I never personally needed it, but it was always on my radar. SN is extremely generous with rate limiting generally, and I've typically always used the Batch API for any large groups of calls to avoid issues. But the module certainly should have this feature.
I would consider building a wrapper function for Invoke-RestMethod/Invoke-WebRequest
Completely agree with your approach here; I actually did just that with another module I've built in my day job. Trying to add it to the existing Invoke-SNOWTableRead would overcomplicate things too much, so keeping it separate is the way to go. I'll see about trying to add this in at some point this weekend!
Your comments about structure make sense too. Going back over and revising the way things are organised will help clean things up, so it's good to have a second pair of eyes to point these things out!
A useful enhancement would be to auto-handle rate limiting errors: retry x times with the backoff provided in the X-RateLimit-Reset or Retry-After header, and if the request still cannot be sent, throw an exception. https://docs.servicenow.com/bundle/sandiego-application-development/page/integrate/inbound-rest/concept/inbound-REST-API-rate-limiting.html