sul-dlss / folio_client

Interface for interacting with the Folio ILS API.

Add method(s) to create source record, instance, and holding from a MARC record #5

Closed lwrubel closed 1 year ago

lwrubel commented 1 year ago

This is a prerequisite for uploading ETD stub MARC records.

The consumer (e.g. hydra_etd) should not need to know about all of these calls; it should be able to call a single method. The client can take care of the steps below.

Uploading a MARC record to FOLIO involves at least three API calls.

1) Create an uploadDefinition with file definitions (minimally, filenames) as the body.

From the response, save id as the uploadDefinitionId used in the URI of the next two calls. Save fileDefinitions.id for use in the URI of the next step, which uploads the file. Example response:

{'id': 'd39546ed-622b-4e09-92ca-210535ff7ab4',
 'metaJobExecutionId': '4ba4f4ab-d3fd-45b1-b73f-f3f0bcff17fe',
 'status': 'NEW',
 'createDate': '2023-02-14T13:16:19.708+00:00',
 'fileDefinitions': [{'id': '181f6315-aa98-4b4d-ab1f-3c7f9df524b1',
   'name': 'before_ils.marc',
   'status': 'NEW',
   'jobExecutionId': '4ba4f4ab-d3fd-45b1-b73f-f3f0bcff17fe',
   'uploadDefinitionId': 'd39546ed-622b-4e09-92ca-210535ff7ab4',
   'createDate': '2023-02-14T13:16:19.708+00:00'}],
 'metadata': {'createdDate': '2023-02-14T13:16:19.474+00:00',
  'createdByUserId': '297649ab-3f9e-5ece-91a3-25cf700062ae',
  'updatedDate': '2023-02-14T13:16:19.474+00:00',
  'updatedByUserId': '297649ab-3f9e-5ece-91a3-25cf700062ae'}}
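A minimal sketch of step 1 in Ruby. The helper names here are hypothetical (not part of folio_client), and the endpoint shape is an assumption based on the FOLIO data-import API (POST /data-import/uploadDefinitions):

```ruby
# Build the request body for POST /data-import/uploadDefinitions.
# Only the filenames are required (hypothetical helper).
def upload_definition_body(filenames)
  { fileDefinitions: filenames.map { |name| { name: name } } }
end

# Pull the ids needed for the next calls out of the parsed response hash:
# the uploadDefinitionId, each fileDefinitions.id, and the jobExecutionId
# (used later to check job status).
def upload_definition_ids(response)
  {
    upload_definition_id: response["id"],
    file_definition_ids: response["fileDefinitions"].map { |fd| fd["id"] },
    job_execution_id: response["fileDefinitions"].first["jobExecutionId"]
  }
end
```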

2) Upload files
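The upload itself is a request per file, with the raw MARC bytes as the body. A sketch of the path construction (the endpoint shape is an assumption based on the FOLIO data-import API; the helper is hypothetical):

```ruby
# Path for uploading one file's raw MARC bytes, using the ids saved
# from the uploadDefinition response in step 1.
def upload_file_path(upload_definition_id, file_definition_id)
  "/data-import/uploadDefinitions/#{upload_definition_id}/files/#{file_definition_id}"
end
```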

3) Request to process files

Returns a 204 if the data import started successfully. (The response does not indicate whether the import completed or whether there were errors.)
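A sketch of the processFiles request. The path is an assumption based on the FOLIO data-import API, and the jobProfileInfo values here are placeholders (the real profile comes from the environment, as noted for the example scripts below):

```ruby
# Path for the processFiles call (assumption: POST
# /data-import/uploadDefinitions/{id}/processFiles).
def process_files_path(upload_definition_id)
  "/data-import/uploadDefinitions/#{upload_definition_id}/processFiles"
end

# Request body: the full uploadDefinition from step 1, plus the job
# profile to run. The profile hash here is a placeholder, not a real one.
def process_files_body(upload_definition, job_profile)
  { uploadDefinition: upload_definition, jobProfileInfo: job_profile }
end
```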

4) Check status of job

To check the status of the import, get the jobExecutionId from the uploadDefinition and GET /metadata-provider/jobSummary/{job_id}

A successful response will be something like:

{'jobExecutionId': 'b91a6f3e-c99f-44b9-8bb4-6b698516cf3d',
 'totalErrors': 0,
 'sourceRecordSummary': {'totalCreatedEntities': 1,
  'totalUpdatedEntities': 0,
  'totalDiscardedEntities': 0,
  'totalErrors': 0},
 'instanceSummary': {'totalCreatedEntities': 1,
  'totalUpdatedEntities': 0,
  'totalDiscardedEntities': 0,
  'totalErrors': 0},
 'holdingSummary': {'totalCreatedEntities': 1,
  'totalUpdatedEntities': 0,
  'totalDiscardedEntities': 0,
  'totalErrors': 0}}

An unsuccessful response will be something like:

{'jobExecutionId': '5f30ce73-b888-4388-ba0d-a0d7cec4d1d2',
 'totalErrors': 1,
 'sourceRecordSummary': {'totalCreatedEntities': 0,
  'totalUpdatedEntities': 0,
  'totalDiscardedEntities': 1,
  'totalErrors': 1}}
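Given the two example responses above, a success check could be as simple as looking at the top-level totalErrors count (hypothetical helper, not part of folio_client):

```ruby
# Decide whether a /metadata-provider/jobSummary response indicates
# success by checking the top-level error count.
def import_succeeded?(job_summary)
  job_summary.fetch("totalErrors", 0).zero?
end
```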

See the example scripts. These can be used to load records successfully by setting up an ~/.okapi file and changing the jobProfileInfo in the script in step 3 (@lwrubel has a working example).

Another example (not yet tested, but it uses the same methods).

ndushay commented 1 year ago

I certainly don't know as much as Laura about this but ... wondering if we need the holding record, or just source and instance.

ahafele commented 1 year ago

The instance and holding will be created by the job profile.