IBM / dbb

The helpful and handy location for finding and sharing example IBM Dependency Based Build (DBB) scripts and snippets.

API for MVS file management #150

Open · FALLAI-Denis opened this issue 1 year ago

FALLAI-Denis commented 1 year ago

Hi,

The IBM DBB APIs expose functions for transfers between MVS data sets and USS files: CopyToHFS and CopyToPDS. The Groovy language allows direct management of USS files.
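For context, a minimal standalone sketch of those two transfer commands; the data set, member, and USS path below are placeholders, not taken from this issue:

    import com.ibm.dbb.build.CopyToHFS
    import com.ibm.dbb.build.CopyToPDS

    // Copy a member from a PDS to a USS file and back again (placeholder names)
    new CopyToHFS().dataset("MYHLQ.BUILD.COBOL").member("MYPGM").file(new File("/tmp/mypgm.cbl")).execute()
    new CopyToPDS().file(new File("/tmp/mypgm.cbl")).dataset("MYHLQ.BUILD.COBOL").member("MYPGM").execute()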

In our build context, we also need to manage MVS data sets directly, but the IBM DBB API does not expose any method for these operations.

There remains the possibility of using ISPFExec or TSOExec objects, but it would be easier to have APIs dedicated to managing MVS data sets, especially since such APIs already exist at the JZOS level.
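As a stopgap at the JZOS level, here is a minimal sketch of calling ZFile directly from a Groovy build script for basic data set management; the data set names are placeholders and the static method signatures should be verified against the JZOS javadoc:

    import com.ibm.jzos.ZFile

    // Fully qualified data set names are passed to JZOS in //'...' notation (placeholder names)
    String workDsn   = "//'MYHLQ.BUILD.TEMP'"
    String backupDsn = "//'MYHLQ.BUILD.BACKUP'"

    if (ZFile.dsExists(workDsn)) {
        ZFile.rename(workDsn, backupDsn)   // rename the data set
        // ZFile.remove(backupDsn)         // or delete it instead, as the DeletePDS sample does
    }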

FALLAI-Denis commented 1 year ago

Hi,

zAppBuild uses direct access through the JZOS ZFile API.

Does this support the output() and deployType() attributes of the DBB DDStatement API?

dennis-behm commented 1 year ago

Hi @FALLAI-Denis ,

Indeed, you can use JZOS to perform these operations from USS (or from your DBB scripts). This is also the API used in the DeletePDS sample utility and for deleting build outputs.

I responded to your query https://github.com/IBM/dbb-zappbuild/issues/279 about the different report record types that are supported and available. ZFile operations do not produce any build report records, but you can create your own record using AnyTypeRecord or PropertiesRecord. See the zAppBuild implementation for documenting deletions.
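As an illustration, a minimal sketch of attaching a custom record to the build report for an operation DBB does not record itself; the record type name and attributes are made up for this example:

    import com.ibm.dbb.build.report.BuildReportFactory
    import com.ibm.dbb.build.report.records.AnyTypeRecord

    // Hypothetical example: document the deletion of a data set member in the build report
    AnyTypeRecord deleteRecord = new AnyTypeRecord("DATASET_MEMBER_DELETE")   // made-up record type
    deleteRecord.setAttribute("dataset", "MYHLQ.BUILD.DBRM")                  // made-up attribute
    deleteRecord.setAttribute("member", "MYPGM")                              // made-up attribute
    BuildReportFactory.getBuildReport().addRecord(deleteRecord)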

FYI: this repository (ibm/dbb) is not monitored for enhancement requests. Please open requests for additional DBB APIs on the DBB Ideas Portal.

For your scenario of copying from one MVS data set to another, you can use MVSExec along with IEBCOPY. Wouldn't that be the easiest way to implement it? What is the particular process you are looking for?

If you are looking for a sample how to copy build outputs to another library, please drop me a message/email. I might have a sample for that.

 /**
  * NAME:
  *
  * Post-build DBB script to IEBCOPY build outputs to other target data sets
  *
  * DESCRIPTION:
  *
  * This script reads the DBB Build Report and identifies the declared build outputs.
  * Based on a mapping file of DBB deployType to target last-level qualifier, it assembles an MVSExec
  * for IEBCOPY to copy the outputs to a different HLQ.
  */
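In the spirit of that sample, here is a minimal sketch of a single IEBCOPY step through MVSExec; the data set names, member name, and log path are placeholders and not taken from the sample script:

    import com.ibm.dbb.build.*

    // Copy one member from a source PDS to a target PDS with IEBCOPY (placeholder names)
    MVSExec copy = new MVSExec().pgm("IEBCOPY")
    copy.dd(new DDStatement().name("IN").dsn("MYHLQ.BUILD.LOAD").options("shr"))      // hypothetical source PDS
    copy.dd(new DDStatement().name("OUT").dsn("MYHLQ.RELEASE.LOAD").options("shr"))   // hypothetical target PDS
    copy.dd(new DDStatement().name("SYSIN").instreamData("  COPY INDD=IN,OUTDD=OUT\n  SELECT MEMBER=MYPGM\n"))
    copy.dd(new DDStatement().name("SYSPRINT").options("cyl space(1,1) unit(vio) new"))
    // Copy the IEBCOPY listing to a USS file so failures can be diagnosed
    copy.copy(new CopyToHFS().ddName("SYSPRINT").file(new File("/tmp/iebcopy.log")))
    int rc = copy.execute()
    println "IEBCOPY ended with RC=$rc"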
FALLAI-Denis commented 1 year ago

Hi @dennis-behm

Thank you for your reply.

I tried to run IEBGENER through MVSExec but systematically got RC=12 at runtime without understanding why (I did not manage to capture a log). Using IEBCOPY seemed too complicated to me because of the SYSIN to format. I would like to see an example of invoking IEBCOPY through MVSExec. Note that I must use setDdnames() on the IEBCOPY/IEBGENER step to reuse the data sets created in previous MVSExec steps.
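Not the official answer, but two patterns from existing DBB build scripts may help here: a temporary data set allocated with pass(true) in one MVSExec can be reused by name in a later MVSExec, and copying SYSPRINT to a USS file (as in the IEBCOPY sketch above) usually makes an RC=12 diagnosable. A sketch with placeholder program, data set, and path names:

    import com.ibm.dbb.build.*

    // Step 1 writes to a temporary data set and passes it to later steps (placeholder program and names)
    MVSExec step1 = new MVSExec().pgm("MYPRECMP")
    step1.dd(new DDStatement().name("SYSUT2").dsn("&&TEMPSRC").options("cyl space(1,1) unit(vio) new").pass(true))
    step1.dd(new DDStatement().name("SYSPRINT").options("cyl space(1,1) unit(vio) new"))
    int rc1 = step1.execute()

    // Step 2 (here IEBGENER) reuses the passed temporary data set by its && name
    MVSExec step2 = new MVSExec().pgm("IEBGENER")
    step2.dd(new DDStatement().name("SYSUT1").dsn("&&TEMPSRC").options("shr"))
    step2.dd(new DDStatement().name("SYSUT2").dsn("MYHLQ.TARGET.PDS(MYPGM)").options("shr"))   // placeholder target
    step2.dd(new DDStatement().name("SYSIN").options("dummy"))
    step2.dd(new DDStatement().name("SYSPRINT").options("cyl space(1,1) unit(vio) new"))
    step2.copy(new CopyToHFS().ddName("SYSPRINT").file(new File("/tmp/iebgener.log")))   // placeholder log path
    int rc2 = step2.execute()
    println "step1 RC=$rc1, step2 RC=$rc2"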

The process concerned is the Db2 pre-compilation. For our own reasons, we always have to go through the Db2 pre-compiler. If a COBOL program does not contain any SQL statement, the pre-compiler produces an empty DBRM, and this empty DBRM must be discarded from the outputs.

Two solutions are possible:

  1. directly feed the DBRM PDS, with output(true), then test whether the DBRM is empty (size = 0) and, if so, delete it, also removing the DBRM from the list of outputs.

  2. store the DBRM in a temporary PDS, without output(true), then test whether the DBRM is empty (size = 0) and, if it is not empty, copy it into the DBRM PDS with output(true).

The second solution adds a step to the build report... For this reason the first solution seems best to us, but it requires knowing how to remove the DBRM from the list of outputs, and currently I don't know how to do that. I will look at https://github.com/IBM/dbb-zappbuild/issues/279 and https://github.com/IBM/dbb-zappbuild/issues/275.
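A minimal sketch of the emptiness test in solution 1, using the JZOS record interface; the data set and member names are placeholders, and dropping the corresponding output record from the build report is a separate question not covered here:

    import com.ibm.jzos.ZFile

    // Open the DBRM member in record mode and check whether it contains any record (placeholder names)
    String dbrmMember = "//'MYHLQ.BUILD.DBRM(MYPGM)'"
    ZFile dbrm = new ZFile(dbrmMember, "rb,type=record,noseek")
    boolean empty
    try {
        byte[] buffer = new byte[dbrm.getLrecl()]
        empty = (dbrm.read(buffer) == -1)   // EOF on the first read means the member is empty
    } finally {
        dbrm.close()
    }
    if (empty) {
        ZFile.remove(dbrmMember)            // delete the empty DBRM member
    }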

dennis-behm commented 1 year ago

I don't understand why you need to run the Db2 pre-compiler for all programs, even those without Db2 statements. But OK.

If this is only about the reporting in the build report, why don't you use DBB's file attribute isSQL?

if (buildUtils.isSQL(logicalFile)) {
        compile.dd(new DDStatement().name("DBRMLIB").dsn("${props.cobol_dbrmPDS}($member)").options('shr').output(true).deployType('DBRM'))
}
else { // create the DBRMLIB allocation if required by the Db2 pre-compiler, but don't report it
        compile.dd(new DDStatement().name("DBRMLIB").dsn("${props.cobol_dbrmPDS}($member)").options('shr'))
}

But our discussion might be a better fit for the zAppBuild repo...

FALLAI-Denis commented 1 year ago

I don't understand why you need to run the Db2 pre-compiler for all programs, even those without Db2 statements.

Because we can't know if a program is really Db2 or not... see next point.

why don't you use DBB's file attribute isSQL?

All our programs are (wrongly) considered to be Db2 because all of them contain at least one EXEC SQL WHENEVER statement that is under conditional compilation control. Our programs are not written manually; they are generated by tools. When a Db2 resource or access is added to a program, the AA-A-DB2 conditional variable becomes true and the Db2 handler routines are compiled; otherwise they are not. This may seem confusing, but it eases the evolution of programs: everything is already in place in the program and is activated automatically at compilation. The developer is relieved of purely technical tasks.

why this?

We don't want to use the Db2 coprocessor of the COBOL compiler: it behaves differently on COBOL sentence endpoints and has runtime performance issues in batch.

Currently, we do a COBOL pre-compilation with the MDECK(NOCOMPILE) option to expand the copybooks before the Db2 pre-compilation (we don't use EXEC SQL INCLUDE because we need to apply REPLACING to host-variable copybooks). It also handles conditional compilation and comments out the lines it excludes. After this step we could determine whether the program is really Db2.

It would be necessary to call the buildUtils.isSQL() function on the source code produced in SYSMDECK by the COBOL pre-compilation. I am looking into this. In that case the sequence would be: 1) COBOL pre-compilation, 2) isSQL() evaluation of the pre-compilation result, 3) adaptation of the Db2 pre-compilation according to isSQL(), or bypassing that step entirely.
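A minimal sketch of step 2, assuming the SYSMDECK output has already been copied to a USS file in the build workspace and using the DBB scanner directly rather than the zAppBuild helper; all paths are placeholders and the exact scan() signature should be checked against the DBB javadoc:

    import com.ibm.dbb.dependency.DependencyScanner
    import com.ibm.dbb.dependency.LogicalFile

    // Scan the expanded source (copied from SYSMDECK to USS) to evaluate its isSQL flag (placeholder paths)
    String workspace = "/u/build/workspace"                      // hypothetical workspace directory
    String expandedSource = "app/cobol_expanded/mypgm.cbl"       // hypothetical file, relative to the workspace
    DependencyScanner scanner = new DependencyScanner()
    LogicalFile expandedLogicalFile = scanner.scan(expandedSource, workspace)
    boolean reallyDb2 = expandedLogicalFile.isSQL()
    println "Run the Db2 pre-compiler for $expandedSource: $reallyDb2"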

FALLAI-Denis commented 1 year ago

PS:

After analysis: the isSQL flag is determined when the scanner runs on the initial source file to create the logicalFile, at the same time as the dependencies are determined. It would be necessary to rerun the scanner on the SYSMDECK file (or a copy in USS) just to obtain the isSQL indicator, which could prove quite expensive in CPU consumption and elapsed time (two successive scanner passes).

dennis-behm commented 1 year ago

@FALLAI-Denis ,

No, a rescan is not required. DBB can override the file attributes (isSQL, isCICS, ...) of the source file based on the attributes of the resolved dependencies (copybooks).

Please see https://github.com/IBM/dbb-zappbuild/blob/zAppBuild_2_x/samples/application-conf/application.properties#L131-L137. This is a capability of the SearchPathDependencyResolver.
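For readers without the link at hand, the relevant switch in zAppBuild's application-conf/application.properties looks roughly like the snippet below; the property name is quoted from memory of the zAppBuild 2.x sample and should be verified against the linked file:

    # Assumed from the zAppBuild 2.x sample configuration: let the SearchPathDependencyResolver
    # adjust file flags such as isSQL / isCICS / isDLI based on the resolved dependencies (copybooks)
    resolveSubsystems=true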

PS: I think this discussion would be better continued in the zAppBuild repository.