jacebenson / jace.pro

A blog about ServiceNow and other technology
https://jace.pro

Flesh out SNUG Lab Due 2/28 (Integrate repeatably faster by following these steps) #43

Closed: jacebenson closed this issue 5 years ago

jacebenson commented 5 years ago

What is the post about? Integrate repeatably faster by following these steps

What things would help with writing the post? https://blog.jacebenson.com/post/2018-12-02-k19-proposals/#integrate-repeatably-faster-by-following-these-steps

jacebenson commented 5 years ago

In short order we've integrated Azure Intune (for device tracking), SolarWinds (for network equipment), and CrashPlan (for backups) with very minimal work. Then, when those different integrations wanted to change something, we spent a little extra time on each of them to ensure we could iterate quickly by being consistent and using import sets.

I've been working with ServiceNow for a long time, and even I am tempted to write directly to the target table. However, writing directly to the table is bad because, over time, the table changes.

Any of those changes can cause the integration to fail at a later date. Worse, if you're writing straight to the table, you have nothing to test to verify the integration is working except the lack of a record update, and even then the update may have come from some other integration.

For these reasons I suggest using import sets for all the integrations where you can. Here are the quick benefits of using an import set:

  1. You can insert all the details from the integration; you're not limited to the columns of the target table.
  2. You can massage the data from the other system.
  3. You can easily add or remove data by changing your import set instead of having to re-read and set up your API calls.
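
To make this concrete, here's a minimal sketch of loading one row into a staging table from a server-side script. The table and column names (u_imp_random_users, u_email, and so on) are hypothetical; the point is that the integration only ever touches the staging table, and the transform map owns the mapping to the target.

```javascript
// Minimal sketch, not the lab's exact code. Assumes a staging table
// named u_imp_random_users (hypothetical) with u_email / u_name_first /
// u_name_last columns. The integration writes here, never to sys_user.
var row = new GlideRecord('u_imp_random_users');
row.initialize();
row.setValue('u_email', 'jane.doe@example.com');
row.setValue('u_name_first', 'Jane');
row.setValue('u_name_last', 'Doe');
row.insert();
// A transform map owns the staging-to-target mapping, so a renamed
// target column is a mapping change, not a code change.
```
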
jacebenson commented 5 years ago

In this lab we're going to pull users in. I can imagine a number of source systems where it would be beneficial to have those users in ServiceNow.

In any case, we're going to use a simple, free, non-ServiceNow resource that returns users with lots of data:

https://randomuser.me/api/?page=3&results=10&seed=sn
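
Here's a minimal sketch of that pull from the instance side, assuming outbound REST via RESTMessageV2; the endpoint and its page, results, and seed parameters come straight from the URL above.

```javascript
// Minimal sketch: fetch the sample users from a background script.
var request = new sn_ws.RESTMessageV2();
request.setHttpMethod('get');
request.setEndpoint('https://randomuser.me/api/?page=3&results=10&seed=sn');
var response = request.execute();
var body = JSON.parse(response.getBody());
gs.info('Fetched ' + body.results.length + ' users, status ' + response.getStatusCode());
```
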

jacebenson commented 5 years ago

The fields returned cover the user's name parts, contact details, location, login info, dates, and picture URLs.

Now, a lot of these fields don't mean anything to us yet. Say the initial request is just to pull in each user's email and name; I'd still pull in all the details up front so you don't have to mess with the import later.

You can easily convert JSON to Excel for the import here: https://codebeautify.org/json-to-excel-converter. Then just replace results.0. with nothing.
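
If you'd rather flatten in script, here's a sketch of the same idea the converter applies: collapse the nested object into dot-notation keys so each key can map to one import set column. It assumes body is the parsed response from the fetch above; the function is just for illustration.

```javascript
// Collapse a nested object like body.results[0] into flat dot-notation
// keys ("name.first", "location.city", ...).
function flatten(obj, prefix, out) {
    out = out || {};
    for (var key in obj) {
        var path = prefix ? prefix + '.' + key : key;
        if (obj[key] !== null && typeof obj[key] === 'object') {
            flatten(obj[key], path, out); // recurse into nested objects
        } else {
            out[path] = obj[key];
        }
    }
    return out;
}

var flat = flatten(body.results[0], '', {});
// flat['name.first'], flat['email'], flat['location.city'], etc.
```
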

jacebenson commented 5 years ago

We could pull in just name.title + name.first + name.last. However, if we pull in all parts of the name object, we don't have to consider the object until later, and we can ask the stakeholder if there's other data they want that they might not even know is available.
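
If the stakeholder later wants, say, the title included, that becomes a transform map tweak instead of a new API pull. Here's a sketch of a "Use source script" field map; the staging column names are assumptions about how the flattened import landed.

```javascript
// Scripted field map on the transform map. 'source' is the staging row;
// u_name_title / u_name_first / u_name_last are assumed column names.
answer = (function transformEntry(source) {
    return source.u_name_title + ' ' + source.u_name_first + ' ' + source.u_name_last;
})(source);
```
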

jacebenson commented 5 years ago

k.pptx

jacebenson commented 5 years ago

If you don't want to run the transform until you've loaded all the records, you'll need a global script include to allow access to GlideImportSetTransformerWorker. Otherwise you'll get this if you try to execute code to do it:

Evaluator: java.lang.SecurityException: GlideImportSetTransformerWorker is not allowed
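
Here's a minimal sketch of what that global script include could look like. The name TransformHelper is hypothetical, and the GlideImportSetTransformerWorker constructor arguments follow the commonly shared community pattern rather than documented API, so verify them on your instance.

```javascript
// Hypothetical global script include wrapping the worker so the
// transform can be kicked off from script after all rows are loaded.
var TransformHelper = Class.create();
TransformHelper.prototype = {
    initialize: function() {},

    // importSetSysId: sys_id of the sys_import_set record
    // stagingTable: name of the import set table, e.g. u_imp_random_users
    transform: function(importSetSysId, stagingTable) {
        var worker = new GlideImportSetTransformerWorker(importSetSysId, stagingTable);
        worker.setBackground(true); // run the transform asynchronously
        worker.start();
    },

    type: 'TransformHelper'
};
```
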

Flatten the import data.

Use import sets, always. Pull all available data into the import set.
