The unittest failure is #6509. I'm going to pick that one up and look into it.
Can you compare the resulting JobResult from a k8s scheduled job versus a Celery scheduled job to make sure they're both fully/correctly populated?
Here is a comparison of the fields populated. Kubernetes is on the left and Celery is on the right. The main differences lie within the `celery_kwargs` field. I wonder if that is acceptable, since we are not using Celery workers to execute the job anyway, so `celery_kwargs` should not be needed in the Kubernetes case.
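For reference, here is a minimal sketch of how such a field-by-field comparison can be produced in `nbshell`. The `diff_job_results` helper and the `k8s_result` / `celery_result` variable names are hypothetical, not part of Nautobot:

```python
# Minimal sketch: diff two JobResult instances field by field in nbshell.
# `k8s_result` and `celery_result` are hypothetical names for the two
# JobResult objects being compared; fetch them however is convenient.
from django.forms.models import model_to_dict

def diff_job_results(left, right):
    """Return {field: (left_value, right_value)} for every field that differs."""
    left_data, right_data = model_to_dict(left), model_to_dict(right)
    return {
        field_name: (left_data.get(field_name), right_data.get(field_name))
        for field_name in sorted(set(left_data) | set(right_data))
        if left_data.get(field_name) != right_data.get(field_name)
    }

# Example usage inside nbshell:
# print(diff_job_results(k8s_result, celery_result))
```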
Regarding the difference in `celery_kwargs`, do the `nautobot_job_profile` and time-limit parameters get set appropriately for the executing job in some other fashion? Can we fix up the difference in `name` (`Job.name` versus `Job.class_path`)?
I think when executing scheduled jobs with Celery [on the right], `nautobot_job_profile`, `soft_time_limit`, etc. are not set because we are not calling `enqueue_job`, whereas when executing scheduled jobs with Kubernetes [on the left], we are calling `enqueue_job`, so those parameters get set properly.
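To make that concrete, here is a toy sketch (not Nautobot's actual code; every name below is a stand-in chosen to mirror the fields in the dump further down) of why `celery_kwargs` only gets populated on the `enqueue_job` path:

```python
# Toy illustration only: the enqueue path records dispatch kwargs, the direct path does not.
from dataclasses import dataclass, field

@dataclass
class FakeJobResult:
    name: str
    celery_kwargs: dict = field(default_factory=dict)

def enqueue_job(name, user_id, profile=False, queue="default"):
    """Mimics the enqueue path: the kwargs used for dispatch are recorded."""
    result = FakeJobResult(name=name)
    result.celery_kwargs = {
        "queue": queue,
        "nautobot_job_profile": profile,
        "nautobot_job_user_id": user_id,
    }
    # ...hand off to the execution backend here...
    return result

def run_scheduled_directly(name):
    """Mimics the old scheduled-job path: nothing records the dispatch kwargs."""
    return FakeJobResult(name=name)

print(enqueue_job("ExportObjectList", "daf75699").celery_kwargs)   # populated
print(run_scheduled_directly("ExportObjectList").celery_kwargs)    # {}
```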
This commit sets the job result `name` and `task_name` to the job's class path.
It seems like the `task_name` is the job class path and the `name` attribute is the actual job name. I will make the changes accordingly.
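As a small illustration of the intended mapping, with stub classes only (not the actual Nautobot models): `task_name` carries the class path, while `name` keeps the human-readable job name.

```python
# Stub illustration of the name / task_name split; not Nautobot model code.
from dataclasses import dataclass

@dataclass
class JobModelStub:
    name: str        # human-readable name, e.g. "Export Object List"
    class_path: str  # dotted path, e.g. "nautobot.core.jobs.ExportObjectList"

@dataclass
class JobResultStub:
    name: str
    task_name: str

def make_job_result(job: JobModelStub) -> JobResultStub:
    # name <- human-readable job name, task_name <- job class path
    return JobResultStub(name=job.name, task_name=job.class_path)

job = JobModelStub(name="Export Object List",
                   class_path="nautobot.core.jobs.ExportObjectList")
print(make_job_result(job))
```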
{ '_state': <django.db.models.base.ModelState object at 0xffff947682d0>,
'id': UUID('7e8eeeec-b651-4bd7-8152-453206059395'),
'_custom_field_data': {},
'job_model_id': UUID('23f7041c-1448-45f4-b70c-a8d1800d55cb'),
'name': 'Export Object List',
'task_name': 'nautobot.core.jobs.ExportObjectList',
'date_created': datetime.datetime(2024, 11, 18, 17, 20, 14, 93357, tzinfo=datetime.timezone.utc),
'date_done': datetime.datetime(2024, 11, 18, 17, 20, 14, 283705, tzinfo=datetime.timezone.utc),
'user_id': UUID('daf75699-07ea-4a1b-880b-f4e2b9ffb44f'),
'status': 'SUCCESS',
'result': None,
'worker': 'celery@ecc7e93cae69',
'task_args': [],
'task_kwargs': { 'content_type': 1,
'query_string': '',
'export_format': 'csv',
'export_template': None},
'celery_kwargs': { 'queue': 'default',
'nautobot_job_profile': False,
'nautobot_job_user_id': 'daf75699-07ea-4a1b-880b-f4e2b9ffb44f',
'nautobot_job_job_model_id': '23f7041c-1448-45f4-b70c-a8d1800d55cb'},
'traceback': None,
'meta': {'children': []},
'scheduled_job_id': None,
'use_job_logs_db': True
}
Closes #6323
What's Changed
- `NautobotDataBaseScheduler.apply_async` method to distinguish between Celery and Kubernetes Jobs (see the sketch below this list)
- `celery-beat-deployment.yaml`
- `docker-compose.min.yaml`, in case re-generation of the Kubernetes deployment files is needed in the future
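For illustration only, a hypothetical sketch of the kind of branching described in the first item above. Neither the backend check nor the manifest details are taken from the actual `NautobotDataBaseScheduler.apply_async` implementation; the image name and the `runjob` invocation are assumptions:

```python
# Hypothetical sketch: dispatch a scheduled job to Celery or to a Kubernetes Job.
# This is NOT the actual NautobotDataBaseScheduler.apply_async code.
def apply_async_sketch(scheduled_job, use_kubernetes: bool):
    if use_kubernetes:
        # Build a Kubernetes Job manifest instead of sending a Celery task.
        manifest = {
            "apiVersion": "batch/v1",
            "kind": "Job",
            "metadata": {"name": f"nautobot-job-{scheduled_job['id']}"},
            "spec": {
                "template": {
                    "spec": {
                        "containers": [{
                            "name": "nautobot-job",
                            "image": "nautobot:latest",  # assumed image name
                            # assumed invocation, for illustration only
                            "args": ["nautobot-server", "runjob", scheduled_job["class_path"]],
                        }],
                        "restartPolicy": "Never",
                    }
                },
            },
        }
        return "kubernetes", manifest
    # Otherwise fall back to the normal Celery dispatch.
    return "celery", {"task": scheduled_job["class_path"], "queue": "default"}

# Example:
print(apply_async_sketch({"id": "1234", "class_path": "nautobot.core.jobs.ExportObjectList"}, True)[0])
```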
Screenshots

1. Run `kubectl apply -f development/kubernetes/celery-beat-deployment.yaml`
2. See the Kubernetes pod setup
3. Run `kubectl exec --stdin --tty nautobot-679bdc765-snfps -- /bin/bash` and then `nautobot-server nbshell` to create a Scheduled Job in `nbshell`
4. See that a Kubernetes Job and a Kubernetes Job Pod are created
5. See that a new Job Result with the status `SUCCESS` is created.

TODO