tapis-project / smart-scheduling

Dynamically optimize job scheduling

To-Do List #1

Open Costaki33 opened 2 years ago

Costaki33 commented 2 years ago
  1. [x] isConnected()

  2. [x] Need to implement complete list of indexes

  3. [x] Create cursors on the fly, pass in the connection, create cursor, close cursor in the respective function

    • Play it safe
  4. [x] Injection

    • Perform all the correct conversion and input into the table
    • Dates to be datetimes (UTC), timelimit to be minutes (int -> look into regex), nnodes (int), jobid (str)
    • Consider what type for nodelist in database
      • Search for how to represent list in mysql
  5. [ ] Injection Improvements

    • Create a last read table
      • Insert the name of the last file read in
      • First time on creation, insert a date that is earlier than all the dates in the sets
      • Commit after the last file is read in
    • Ignore any files before that specified date
  6. [ ] On the command line

    • Pass in all parameters in the command line without prompting
      • Security for the user login information
        • Password file with proper permissions
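Item 4 above calls for converting the raw fields before inserting them. A minimal sketch of those conversions, assuming Slurm-style `D-HH:MM:SS` time limits and ISO-8601 date strings (the exact input formats are an assumption, not confirmed by the issue):

```python
import re
from datetime import datetime, timezone

# Slurm-style time limit: optional "D-" day prefix, then HH:MM:SS (assumed format).
TIMELIMIT_RE = re.compile(r"^(?:(\d+)-)?(\d{1,2}):(\d{2}):(\d{2})$")

def timelimit_to_minutes(raw: str) -> int:
    """Convert a 'D-HH:MM:SS' or 'HH:MM:SS' time limit to whole minutes,
    rounding any leftover seconds up to a full minute."""
    m = TIMELIMIT_RE.match(raw.strip())
    if m is None:
        raise ValueError(f"unrecognized timelimit: {raw!r}")
    days, hours, minutes, seconds = (int(g or 0) for g in m.groups())
    return days * 24 * 60 + hours * 60 + minutes + (1 if seconds else 0)

def to_utc_datetime(raw: str) -> datetime:
    """Parse an ISO-8601 timestamp and pin it to UTC for the DATETIME column."""
    return datetime.fromisoformat(raw).replace(tzinfo=timezone.utc)

def nodelist_to_db(nodes: list[str]) -> str:
    """MySQL has no native list type; one simple option is storing the
    node list as a comma-joined string (a JSON column is another)."""
    return ",".join(nodes)
```

Each function is pure, so the conversions can be unit-tested without a database connection.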
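Item 5's last-read table boils down to one stored watermark and one filter: skip files dated at or before the watermark, and advance it after each commit. A sketch of the filtering half in plain Python (`last_read` stands in for the date fetched from the table, which is not shown):

```python
from datetime import datetime

# Seed value for first run: earlier than any date in the data sets,
# so every file is picked up (per the to-do: "insert a date that is
# earlier than all the dates in the sets").
EPOCH = datetime(1970, 1, 1)

def files_to_ingest(files: dict[str, datetime], last_read: datetime) -> list[str]:
    """Return file names dated strictly after the last-read watermark,
    oldest first, so the watermark can be advanced file by file."""
    fresh = [(date, name) for name, date in files.items() if date > last_read]
    return [name for _, name in sorted(fresh)]
```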
Costaki33 commented 2 years ago

Notes:

UPDATE:

Costaki33 commented 2 years ago

Add debug statements to see how long the data load is running
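One low-effort way to add those debug statements is a timing decorator around the injection functions, using only the stdlib `logging` and `time` modules; `inject_file` below is a placeholder, not a function from this repo:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("smart-scheduling")

def timed(func):
    """Log how long each call to `func` takes, at DEBUG level."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            log.debug("%s took %.3f s", func.__name__, time.perf_counter() - start)
    return wrapper

@timed
def inject_file(path):
    """Placeholder for the real injection routine."""
    return path
```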

Costaki33 commented 2 years ago

Point to a directory and load into the database all the files it hasn't yet processed. Admins -> we move the new files to a new directory and read those in. No archive. Grab the new ones, read them in, and delete them.
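The grab, read-in, and delete flow above can be sketched with `pathlib`; `handle` is a placeholder for the actual injection step, and deleting only after a successful read keeps an interrupted run re-runnable:

```python
from pathlib import Path

def ingest_directory(directory, handle):
    """Read every file in `directory` in sorted-name order, pass its text
    to `handle`, then delete it — no archive, per the plan above."""
    processed = []
    for path in sorted(Path(directory).iterdir()):
        if not path.is_file():
            continue
        handle(path.read_text())
        path.unlink()          # delete only after a successful read-in
        processed.append(path.name)
    return processed
```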

Christian's idea: keep a list of all the HPC names and directory paths, SSH into each HPC in a loop, and grab (or scp) each file based on its position in the listing relative to the position of the last file read in
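The position-based part of Christian's idea reduces to a list slice per host: everything after the last file already read in. A sketch of that selection, with the SSH/scp step left as a comment since the client library (e.g. paramiko) is still an open choice:

```python
def remote_files_to_fetch(listing, last_read):
    """Given a remote directory listing and the name of the last file
    already read in, return the sorted files that still need fetching."""
    ordered = sorted(listing)
    if last_read is None:          # first run: fetch everything
        return ordered
    try:
        idx = ordered.index(last_read)
    except ValueError:             # watermark vanished remotely: refetch all
        return ordered
    return ordered[idx + 1:]

# For each (hpc_name, directory_path) pair, one would then SSH in, list the
# directory, and scp back remote_files_to_fetch(listing, last_read).
```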

Initial load:

For each update: see above

Contact Virginia