SongGithub opened this issue 5 years ago
Yes, we should add this information to the README:
There are obviously some downsides:
So for me it's mostly about having a lightweight library to interact with the Kubernetes API when you don't need a full-blown client with many dependencies. It's near perfect for my use cases (kube-janitor, kube-downscaler, kube-ops-view, kube-resource-report), but maybe not the right choice for others.
@SongGithub as pykube-ng is opinionated, I also want to discuss the future pykube-ng API interface, see https://github.com/hjacobs/pykube/issues/13
Thanks for your reply! I think if we value portability across infrastructures, not mandating custom authentication mechanisms is actually a benefit.
I really like that this client uses Python dictionaries to specify resources. This means you can keep your resources in standard-looking YAML files and apply them by invoking kubectl, but also apply them programmatically without resorting to shelling out to kubectl. With the official API client, converting between a dictionary representation and the expected object representation is a PITA.
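To illustrate the point, here is a minimal sketch of that pattern with pykube (the deployment.yaml file name and its contents are just assumptions for the example):

import pykube
import yaml

api = pykube.HTTPClient(pykube.KubeConfig.from_env())

# The manifest is loaded as a plain Python dict, in the same shape kubectl expects
with open("deployment.yaml") as f:
    manifest = yaml.safe_load(f)

pykube.Deployment(api, manifest).create()

The same dict could be dumped back to YAML and applied with kubectl unchanged.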
FYI: a colleague of mine created a high-level Python Operator Framework ("Kopf") which makes it really easy to create operators for CRDs with just a few lines of Python: https://github.com/zalando-incubator/kopf
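For a rough idea of what that looks like, a minimal Kopf handler sketch (the resource group, version, and plural name below are placeholders taken from the Kopf examples, not anything specific to this thread):

import kopf

# Called whenever an object of the example CRD is created
@kopf.on.create('zalando.org', 'v1', 'kopfexamples')
def create_fn(spec, meta, **kwargs):
    print(f"Created {meta['name']} with spec: {spec}")

Such a handler file is then started with the kopf run CLI command.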
@hjacobs a bit off topic, sorry for hijacking the thread. I am trying to interact with a custom operator and am not sure how to do it with pykube. The last comment you left leads me to believe it should be possible, but I'm not sure I can find the documentation for it. ThirdPartyResource doesn't seem to be it.
For completeness, I am trying to interact with https://github.com/GoogleCloudPlatform/spark-on-k8s-operator. If anyone could point me in the right direction, that would be great.
@ekhaydarov what do you want to do? Do you want to modify a CRD resource?
You can always create a custom class for custom resources and use it, e.g.:
import pykube
from pykube.objects import NamespacedAPIObject

# Custom resource class for the SparkApplication CRD
class SparkApplication(NamespacedAPIObject):
    version = "sparkoperator.k8s.io/v1beta1"
    endpoint = "sparkapplications"
    kind = "SparkApplication"

# Connect to the cluster (in-cluster service account or local kubeconfig) and list all SparkApplications
api = pykube.HTTPClient(pykube.KubeConfig.from_env())
for sparkapp in SparkApplication.objects(api):
    print(sparkapp.name, sparkapp.labels)
This would create a class for the CRD referenced here: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/manifest/spark-operator-crds.yaml#L18
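The same class can also be used to create a resource from a plain dict, e.g. (a sketch only; the manifest below is illustrative and not a complete spark-on-k8s-operator spec):

manifest = {
    "apiVersion": "sparkoperator.k8s.io/v1beta1",
    "kind": "SparkApplication",
    "metadata": {"name": "example-app", "namespace": "default"},
    "spec": {},  # illustrative placeholder, fill in the real SparkApplication spec fields
}
SparkApplication(api, manifest).create()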
There is an official lib.
And when choosing a Python lib for a k8s project, what are the pros and cons of choosing this lib? I guess you and the original author would know best.