tonyyang-svail opened this issue 5 years ago:
Should we implement something like the following?

```sql
SHOW JOBS;
-- | 0 | TRAIN iris.train |
KILL 0;
```
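As a rough sketch of what could back such statements, here is a hypothetical in-memory job table with show and kill operations. The type and method names are illustrative assumptions, not SQLFlow's actual code:

```go
package main

import (
	"fmt"
	"sync"
)

// jobRegistry is a hypothetical in-memory table that could back
// SHOW JOBS and KILL <id>; SQLFlow's real bookkeeping may differ.
type jobRegistry struct {
	mu   sync.Mutex
	next int
	jobs map[int]string // id -> description, e.g. "TRAIN iris.train"
}

func newJobRegistry() *jobRegistry {
	return &jobRegistry{jobs: map[int]string{}}
}

// Add registers a submitted job and returns its numeric id.
func (r *jobRegistry) Add(desc string) int {
	r.mu.Lock()
	defer r.mu.Unlock()
	id := r.next
	r.next++
	r.jobs[id] = desc
	return id
}

// Show returns one "| id | description |" row per live job,
// roughly what SHOW JOBS would print.
func (r *jobRegistry) Show() []string {
	r.mu.Lock()
	defer r.mu.Unlock()
	rows := []string{}
	for id := 0; id < r.next; id++ {
		if desc, ok := r.jobs[id]; ok {
			rows = append(rows, fmt.Sprintf("| %d | %s |", id, desc))
		}
	}
	return rows
}

// Kill removes a job from the table, as KILL <id> might do after
// stopping the underlying process or cluster job.
func (r *jobRegistry) Kill(id int) error {
	r.mu.Lock()
	defer r.mu.Unlock()
	if _, ok := r.jobs[id]; !ok {
		return fmt.Errorf("no such job: %d", id)
	}
	delete(r.jobs, id)
	return nil
}

func main() {
	r := newJobRegistry()
	id := r.Add("TRAIN iris.train")
	fmt.Println(r.Show()) // [| 0 | TRAIN iris.train |]
	fmt.Println(r.Kill(id) == nil)
}
```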
Good idea, @typhoonzero. It would also be better to add job status, job scheduler, job type, running time, etc.
Another discussion point: should we expect users to start long-running jobs in SQLFlow as it evolves? I am sure there will be some in AntFin's case.
I don't feel this is an urgent feature request.
Also, the design of SQLFlow keeps the flexibility to delegate to submitter programs the decision of how and where to submit jobs. Suppose that a SQLFlow deployment is configured to generate submitters that submit jobs to Kubernetes; the user should then be able to watch her jobs using the Kubernetes dashboard.
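That delegation point could be sketched as a small interface with pluggable implementations. All names here (`Submitter`, `localSubmitter`, `k8sSubmitter`) are assumptions for illustration, not SQLFlow's actual API:

```go
package main

import "fmt"

// Submitter is a hypothetical interface for the delegation described above:
// a deployment decides how and where jobs run by choosing an implementation.
type Submitter interface {
	Submit(sql string) (jobHandle string, err error)
}

// localSubmitter runs jobs on the SQLFlow server itself (single-node case).
type localSubmitter struct{ count int }

func (s *localSubmitter) Submit(sql string) (string, error) {
	s.count++
	// A real implementation would fork a submitter program here.
	return fmt.Sprintf("local-job-%d", s.count), nil
}

// k8sSubmitter would instead create a Kubernetes Job and return its name,
// so the user can watch it on the Kubernetes dashboard. (Stubbed here.)
type k8sSubmitter struct{ namespace string }

func (s *k8sSubmitter) Submit(sql string) (string, error) {
	// A real implementation would call the Kubernetes API.
	return s.namespace + "/sqlflow-train", nil
}

func main() {
	var sub Submitter = &localSubmitter{}
	h, _ := sub.Submit("SELECT ... TRAIN DNNClassifier ...")
	fmt.Println(h) // local-job-1
}
```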
I would like to see that after submitting a job in Jupyter, it prints a URL to the Kubernetes dashboard.
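Printing such a URL is mostly string formatting once the submitter knows the namespace and job name. The path layout below is an assumed example of a typical dashboard URL scheme, not a guaranteed one:

```go
package main

import "fmt"

// dashboardURL builds a link to a job's page on a Kubernetes dashboard.
// The path layout is a hypothetical example; real dashboards vary by version.
func dashboardURL(base, namespace, jobName string) string {
	return fmt.Sprintf("%s/#!/job/%s/%s", base, namespace, jobName)
}

func main() {
	// Jupyter could print this line right after submitting the job.
	fmt.Println(dashboardURL("https://dashboard.example.com", "default", "sqlflow-train-0"))
}
```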
In a single-node setup, should SQLFlow return something like a process id if it's a long-running job?
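In the single-node case, returning the submitter process's pid could look like the sketch below. `startJob` is a made-up helper, not SQLFlow code:

```go
package main

import (
	"fmt"
	"os/exec"
)

// startJob launches a long-running command and returns its process id,
// which a single-node SQLFlow server could hand back as the job handle.
func startJob(name string, args ...string) (int, error) {
	cmd := exec.Command(name, args...)
	if err := cmd.Start(); err != nil {
		return 0, err
	}
	go cmd.Wait() // reap the child when it exits
	return cmd.Process.Pid, nil
}

func main() {
	pid, err := startJob("sleep", "1")
	fmt.Println(pid > 0, err)
}
```

Killing the job would then amount to sending a signal to that pid, which gives `KILL <id>` a natural meaning in the single-node setup too.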
Need to provide a way to kill/stop a long-running query/ML job.