| :exclamation: Looking for Mainframe Data Utilities v1? |
|---|
See CONTRIBUTING for more information.
This project is licensed under the Apache-2.0 License.
Mainframe Data Utilities is an AWS Sample written in Python.
The purpose of this project is to provide Python scripts as a starting point for those who need to read EBCDIC files transferred from mainframe and AS/400 platforms, whether on AWS or in any other distributed environment.
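For readers unfamiliar with the format, the minimal sketch below shows the core idea behind these scripts: reading a fixed-length EBCDIC file record by record and decoding each record with one of Python's built-in EBCDIC code pages. The file name, record length, and the cp037 code page are assumptions chosen for illustration; the project's actual scripts drive the conversion from copybook metadata and also handle packed-decimal and binary fields.

```python
import codecs

RECORD_LENGTH = 100      # assumed fixed record length (LRECL), for illustration only
EBCDIC_CODEC = "cp037"   # one common EBCDIC code page; others exist

def read_fixed_records(path: str, lrecl: int = RECORD_LENGTH):
    """Yield each fixed-length record of an EBCDIC file decoded to a Python string."""
    with open(path, "rb") as ebcdic_file:
        while True:
            record = ebcdic_file.read(lrecl)
            if not record:
                break
            yield codecs.decode(record, EBCDIC_CODEC)

# Hypothetical input file name, used only to show the call pattern.
for line in read_fixed_records("sample-ebcdic.bin"):
    print(line)
```

Note that a plain byte-level decode like this only works for character fields; packed-decimal (COMP-3) and binary fields require field-level handling driven by the copybook layout, which is what the scripts in this repository do.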
The REDEFINES statement is not supported for data items; it is only supported for group items.
Download the code to run a preloaded example.
From a Windows, Mac, or Linux shell (including AWS CloudShell), clone this repository and change into its directory:
git clone https://github.com/aws-samples/mainframe-data-utilities.git mdu
cd mdu
The following examples show how to extract data in different use cases:
Document | Description |
---|---|
Single Layout FB file | The simplest conversion. Local, 'fixed blocked' and 'single layout' input file. |
Read JSON metadata from Amazon S3 | Read the JSON metadata file from Amazon S3. |
Single Layout FB file | Convert a file using multithreading and generating multiple output files. |
Single Layout VB file | Convert a Variable Block input file. |
Multiple Layout file | Convert a multiple layout input file. |
Read the input file from S3 | Get the input file from S3 and generate a local converted file. |
Write the output file on S3 | Read a local file and write a converted file on S3. |
Write the output data on DynamoDB | Read a local file and write its data on DynamoDB. |
Convert files using a Lambda function | Use a Lambda function to read an EBCDIC file from S3 and write the converted file back to S3 (see the sketch after this table). |
Convert files using S3 Object Lambda | Use an Object Lambda to convert an EBCDIC file while it is downloaded from S3. |
Split files by content/key | Split an EBCDIC file according to a provided key. |
Discard specific layout | Convert a multiple layout input file while discarding selected record types. |
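As an illustration of the Lambda-based flow referenced in the table above, the sketch below reads an EBCDIC object from S3, decodes it record by record, and writes the converted text back to S3. The event shape, bucket and key names, record length, and the plain cp037 decode are assumptions made for illustration; the repository's scripts perform the real conversion using a copybook-derived JSON layout.

```python
import boto3

s3 = boto3.client("s3")

# Illustrative assumptions: fixed-length records and a plain EBCDIC (cp037) text decode.
LRECL = 100
EBCDIC_CODEC = "cp037"
OUTPUT_BUCKET = "my-converted-bucket"   # hypothetical output bucket name

def lambda_handler(event, context):
    # Hypothetical event shape: the source bucket and key are passed in directly.
    bucket = event["bucket"]
    key = event["key"]

    raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

    # Split the byte stream into fixed-length records and decode each one.
    records = [
        raw[offset:offset + LRECL].decode(EBCDIC_CODEC)
        for offset in range(0, len(raw), LRECL)
    ]

    s3.put_object(
        Bucket=OUTPUT_BUCKET,
        Key=f"converted/{key}.txt",
        Body="\n".join(records).encode("utf-8"),
    )
    return {"records": len(records)}
```

Field-level conversion of packed-decimal, binary, and multi-layout records still needs the copybook metadata, which is what the linked examples demonstrate.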