Teradata / jupyter-demos


Set up standard scripts on gss1185 for adding/replacing DDL & cloud tables #452

Open · DougEbel opened 10 months ago

DougEbel commented 10 months ago

We have undocumented processes that rely on scripts on gss1183. We need to create a directory, /data/demonow/cloud_bucket, with all of the needed authorizations kept in /data/demonow/monitor.profile, and with the following scripts:

There should be a ddl directory where the DDL files are stored, each named to match the object being posted (for example, a hypothetical ddl/SalesDemo.sql for table SalesDemo).

The movement of data to the cloud will need parameters:
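For context, a WRITE_NOS call to Google Cloud Storage generally needs at least a target LOCATION (the bucket URI), an AUTHORIZATION object holding the bucket credentials, and a storage format such as PARQUET. Here is a minimal sketch of creating a GCS authorization object; the database, object name, and credential values are all hypothetical:

```sql
-- Hypothetical names throughout; the USER/PASSWORD values come from
-- the GCS service account (client email and private key).
CREATE AUTHORIZATION demo_db.gcs_auth
  USER 'demo-sa@my-project.iam.gserviceaccount.com'
  PASSWORD '<private key from the service account JSON>';
```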

You could either create a script that runs bteq on gss1185 to connect to your machine and execute a WRITE_NOS query, or use a BTEQ script to install a stored procedure on your machine that you can use for moving tables. (This assumes that you will be creating new machines from time to time and that we don't want anything permanent on the environments for doing the movement.)
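As a sketch of the first option, assuming an authorization object like the one above, the BTEQ script could look roughly like this; the host, logon, database, table, and bucket path are all placeholders:

```sql
-- push_table.bteq: hypothetical sketch of a one-table push via WRITE_NOS.
.LOGON my-env.clearscape.teradata.com/demo_user,demo_password

SELECT * FROM WRITE_NOS (
  ON ( SELECT * FROM demo_db.my_table )
  USING
    LOCATION('/gs/storage.googleapis.com/my-demo-bucket/my_table/')
    AUTHORIZATION(demo_db.gcs_auth)
    STOREDAS('PARQUET')
) AS d;

.LOGOFF
.EXIT
```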

DallasBowden commented 9 months ago

Scripts have been moved from gss1183 to gss1185 and tested. The DDL lives under /data/demonow and is uploaded with Upload_ddl.sh. For the data upload, Shilpa created a macro and a stored procedure that use Read/Write NOS to move the data.
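For reference, a minimal sketch of a procedure along those lines, wrapping WRITE_NOS with dynamic SQL and reusing the hypothetical authorization object and bucket path from above (the procedure name and parameters are likewise hypothetical, not Shilpa's actual code):

```sql
REPLACE PROCEDURE demo_db.push_to_gcs (
  IN db_name  VARCHAR(128),
  IN tbl_name VARCHAR(128) )
DYNAMIC RESULT SETS 1
BEGIN
  DECLARE sql_txt VARCHAR(8000);
  -- Return the WRITE_NOS result (the list of objects written) to the caller.
  DECLARE cur CURSOR WITH RETURN ONLY FOR stmt;
  SET sql_txt =
    'SELECT * FROM WRITE_NOS ( ON ( SELECT * FROM '
    || db_name || '.' || tbl_name || ' ) '
    || 'USING LOCATION(''/gs/storage.googleapis.com/my-demo-bucket/'
    || tbl_name || '/'') '
    || 'AUTHORIZATION(demo_db.gcs_auth) STOREDAS(''PARQUET'') ) AS d';
  PREPARE stmt FROM sql_txt;
  OPEN cur;
END;
```

A call would then look like `CALL demo_db.push_to_gcs('demo_db', 'my_table');`.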

DallasBowden commented 9 months ago

DDL upload works correctly. We still need to test pushing data to the cloud. Doug has added a help feature to guide users through the process.
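Once the push is tested, one way to verify that the objects landed in the bucket is to list them with READ_NOS; a sketch reusing the hypothetical bucket path and authorization object from the earlier examples:

```sql
-- List the objects under the (hypothetical) target prefix.
SELECT location, objectlength FROM READ_NOS (
  USING
    LOCATION('/gs/storage.googleapis.com/my-demo-bucket/my_table/')
    AUTHORIZATION(demo_db.gcs_auth)
    RETURNTYPE('NOSREAD_KEYS')
) AS d;
```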

DallasBowden commented 8 months ago

I've copied the document containing the steps and scripts for uploading the DDL and data into GCS for the notebooks to the CSAE Maintenance/DDL and Data Uploading to GCS folder. I've also added a document with the details that were provided in an email thread with Doug.

Please review it and make any necessary changes/edits. https://teradata.sharepoint.com/:f:/r/teams/ClearScapeDemo/Shared%20Documents/CSAE%20Maintenance/DDL%20and%20Data%20Uploading%20to%20GCS?csf=1&web=1&e=6c4TDV