Open typhoonzero opened 4 years ago
It seems that we also need a `pkg/submitter` package:

```
-pkg/submitter
 |-python.go  # cmd: python xxx.py
 |-pai.go     # cmd: pai -Djobname=sqlflow_job ...
 |-alisa.go   # goalisa: alisa.createTask('pai -Djobname=sqlflow_job ...')
 |- TODO: alps/elasticdl ....
```
@Yancey1989 I recommend moving all the current "submitter" code to `pkg/step` and renaming the interface to `Executor`, so we can call the executors like `step.Executor.ExecuteTrain(...)` etc. It's more meaningful.
c.f. https://github.com/sql-machine-learning/sqlflow/pull/1553#issuecomment-569858480
Currently, the `sql` package contains almost all the core code for parsing, generating Python code, and executing. We need to put those features into a separate package structure for better code understanding:

- the `parser` package is already moved under the `pkg` folder
- the `feature_derivation` package is already moved under the `pkg` folder
- `pipe` and `verifier` are already moved under the `pkg` folder
- move `ir` to the `pkg` folder
- move `testdata` to the `pkg` folder

We currently have two job execution modes: workflow mode and local mode.
In workflow mode, we use `pkg/argo` to submit the generated workflow and monitor the job's status. Each step in the workflow is a `repl` command. We may need to use a command `step` instead of `repl` to be more meaningful. For that we'll have:

- `cmd/step` calls `pkg/step` to run a step or, in the future, generate step Python code
- `pkg/step` contains:
  - `step` (to run a single SQL statement)
  - `pkg/step/codegen` to generate step Python code
- `pkg/workflow` contains:
  - `pkg/workflow/codegen` to generate Couler/Fluid Python code
  - `pkg/workflow/argo` to submit, get status, and get logs for Argo
  - `pkg/workflow/tekton` to submit, get status, and get logs for Tekton