sunilganatra closed this issue 4 years ago
Resource quota apis
>>> # Get instance resource quota
>>> client.get_instance_resource_quota(SPARK_INSTANCE)
'{"_id":"c69336e4248c4a87923274dea6166650","home_volume":{"type":"pvc","cos_hmac_keys":{},"name":"volumes-vol-inst1-pvc"},"api_key":"aef9d7b7-6220-440c-a251-f0bf6f9a28d7","state":"Created","namespace":"spark301","cpu_quota":20,"memory_quota":"20g","available_cpu_quota":18,"availalbe_memory_quota":"17g","creationDate":"Friday 06 November 2020 13:54:33.618+0000","updationDate":"Friday 06 November 2020 13:54:33.657+0000"}'
>>> # Update resource quota
>>> client.update_instance_resource_quota(SPARK_INSTANCE, cpu_quota=500, memory_quota='2000g')
''
>>> client.get_instance_resource_quota(SPARK_INSTANCE)
'{"_id":"c69336e4248c4a87923274dea6166650","home_volume":{"type":"pvc","cos_hmac_keys":{},"name":"volumes-vol-inst1-pvc"},"api_key":"aef9d7b7-6220-440c-a251-f0bf6f9a28d7","state":"Created","namespace":"spark301","cpu_quota":500,"memory_quota":"2000g","available_cpu_quota":498,"availalbe_memory_quota":"1997g","creationDate":"Friday 06 November 2020 13:54:33.618+0000","updationDate":"Friday 06 November 2020 13:54:33.657+0000"}'
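Note that the quota calls above return the response body as a raw JSON string, so the caller has to decode it before using the numbers. A minimal sketch of that decoding step (the `remaining_quota` helper is hypothetical, not part of the client library; the sample fields are taken from the transcript above, including the service's own misspelling `availalbe_memory_quota`):

```python
import json

def remaining_quota(quota_json: str) -> dict:
    """Decode a get_instance_resource_quota response and report free vs. total."""
    q = json.loads(quota_json)
    return {
        "cpu_free": q["available_cpu_quota"],
        "cpu_total": q["cpu_quota"],
        # the service returns this field with the misspelled key
        "memory_free": q["availalbe_memory_quota"],
        "memory_total": q["memory_quota"],
    }

# Trimmed-down sample of the response shown in the transcript
sample = ('{"cpu_quota":20,"memory_quota":"20g",'
          '"available_cpu_quota":18,"availalbe_memory_quota":"17g"}')
print(remaining_quota(sample))  # {'cpu_free': 18, 'cpu_total': 20, 'memory_free': '17g', 'memory_total': '20g'}
```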
Upload & Submit spark jobs
>>> # Define spark job payload
>>> payload = {
...     "engine": {
...         "type": "spark",
...         "template_id": "spark-2.4.0-jaas-v2-cp4d-template",
...         "conf": {
...             "spark.app.name": "myjob1"
...         },
...         "size": {
...             "num_workers": 15,
...             "worker_size": {
...                 "cpu": 4,
...                 "memory": "20g"
...             },
...             "driver_size": {
...                 "cpu": 5,
...                 "memory": "20g"
...             }
...         }
...     },
...     "application_arguments": [],
...     "main_class": "org.apache.spark.deploy.SparkSubmit"
... }
>>> # Upload spark job & submit
>>> client.upload_and_submit_job(SPARK_INSTANCE, APP_VOLUME_INSTANCE, '/Users/sunilganatra/testSparkApp.py', params_json=payload)
{'_messageCode_': 'Success', 'message': 'Successfully uploaded file and created the necessary directory structure'}
'{"id":"60399c5b-b973-4ed6-9ae5-df00484c073f","job_state":"RUNNING"}'
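Since the payload's `size` section drives how much of the instance quota a job will consume, it can be useful to total it up before submitting. A small sketch under that assumption (`payload_cpu_demand` is a hypothetical helper, not a client method; the payload is the one from the transcript above):

```python
def payload_cpu_demand(payload: dict) -> int:
    """Total CPUs the job will claim: all workers plus the driver."""
    size = payload["engine"]["size"]
    return (size["num_workers"] * size["worker_size"]["cpu"]
            + size["driver_size"]["cpu"])

payload = {
    "engine": {
        "type": "spark",
        "template_id": "spark-2.4.0-jaas-v2-cp4d-template",
        "conf": {"spark.app.name": "myjob1"},
        "size": {
            "num_workers": 15,
            "worker_size": {"cpu": 4, "memory": "20g"},
            "driver_size": {"cpu": 5, "memory": "20g"},
        },
    },
    "application_arguments": [],
    "main_class": "org.apache.spark.deploy.SparkSubmit",
}

print(payload_cpu_demand(payload))  # 15 * 4 + 5 = 65
```

Comparing this figure against `available_cpu_quota` from the quota API would catch an over-quota submission before the upload happens.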
Delete a job / all finished jobs / Delete all jobs
>>> client.get_all_jobs(SPARK_INSTANCE)
'[{"id":"0994d1bb-7849-4f49-8655-805c4dfeef47","job_state":"FINISHED"},{"id":"60399c5b-b973-4ed6-9ae5-df00484c073f","job_state":"FINISHED"}]'
>>> # Delete spark job
>>> client.delete_spark_job(SPARK_INSTANCE, job_id='0994d1bb-7849-4f49-8655-805c4dfeef47')
''
>>> client.get_all_jobs(SPARK_INSTANCE)
'[{"id":"60399c5b-b973-4ed6-9ae5-df00484c073f","job_state":"FINISHED"}]'
>>> client.get_all_jobs(SPARK_INSTANCE)
'[{"id":"6b343509-9e48-4516-9e3e-4af8916af1d2","job_state":"FINISHED"},{"id":"798bd77b-ba96-419c-9690-a3f28edc160b","job_state":"FINISHED"},{"id":"7a75b56e-4f24-498a-977d-ba19c4ca7b4e","job_state":"FINISHED"},{"id":"9371573e-530d-4c7b-bd9a-b8c1541b0822","job_state":"FINISHED"},{"id":"ba9f934a-df25-4b54-8d81-4e35582275a8","job_state":"FINISHED"},{"id":"6b343509-9e48-4516-9e3e-4af8916af1d2","job_state":"RUNNING"}]'
>>> client.delete_all_finished_spark_job(SPARK_INSTANCE)
>>> client.get_all_jobs(SPARK_INSTANCE)
'[{"id":"fc4f8b8f-1cd8-4e71-88a3-7305a80394b5","job_state":"RUNNING"}]'
>>> client.delete_all_spark_job(SPARK_INSTANCE)
>>> client.get_all_jobs(SPARK_INSTANCE)
'[]'
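As with the quota calls, `get_all_jobs` returns a JSON string, so selective cleanup means decoding it and filtering on `job_state`. A minimal sketch of that filtering step (`finished_job_ids` is a hypothetical helper, not part of the client; the sample reuses job ids from the transcript above):

```python
import json

def finished_job_ids(jobs_json: str) -> list:
    """Pick out the ids of jobs already in the FINISHED state."""
    return [job["id"]
            for job in json.loads(jobs_json)
            if job["job_state"] == "FINISHED"]

sample = ('[{"id":"0994d1bb-7849-4f49-8655-805c4dfeef47","job_state":"FINISHED"},'
          '{"id":"60399c5b-b973-4ed6-9ae5-df00484c073f","job_state":"RUNNING"}]')
print(finished_job_ids(sample))  # ['0994d1bb-7849-4f49-8655-805c4dfeef47']
```

Passing each returned id to `client.delete_spark_job(SPARK_INSTANCE, job_id=...)` would mirror what `delete_all_finished_spark_job` does in one call, while leaving running jobs untouched.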
@Dheeraj-Arremsetty Let me know if you want me to contribute unit tests for the new APIs.