jupyterhub / zero-to-jupyterhub-k8s

Helm Chart & Documentation for deploying JupyterHub on Kubernetes
https://zero-to-jupyterhub.readthedocs.io

Not able to spawn specific notebooks using API #1552

Closed ankitladhania closed 4 years ago

ankitladhania commented 4 years ago

Hi guys, we are using the JupyterHub API to spawn notebooks. The request goes through, but it spawns the default image every time. We deployed the infrastructure using Zero to JupyterHub.

Please find below the config file and the API request we are using:

proxy:
  secretToken: SSSSSSSSSSSSSSSSSSS
  service:
    type: NodePort
    nodePorts:
      http: 30030
      https: 30031

debug:
  enabled: true

auth:
  type: "custom"
  custom:
    className: "jhub_remote_user_authenticator.remote_user_auth.RemoteUserAuthenticator"
  admin:
    users:
      - admin
hub:
  image:
    name: a1996kash/k8s-hub
    tag: 1.0.4
  extraConfig:
    myConfig.py: |
      #c.KubeSpawner.extra_resource_guarantees = {"nvidia.com/gpu": "1"}
      #c.KubeSpawner.extra_resource_limits = {"nvidia.com/gpu": "1"}
      #c.JupyterHub.allow_named_servers = True
      c.JupyterHub.log_level = 'DEBUG'
      c.Spawner.debug = True
      c.KubeSpawner.debug = True
      c.LocalProcessSpawner.debug = True
      c.Spawner.default_url = '/lab'
      c.JupyterHub.base_url = u'/jhub'
      c.JupyterHub.logo_file = u'/usr/local/share/jupyter/hub/static/images/logo.png'
      c.JupyterHub.services.append({
          'name': 'cull-idle',
          'admin': True,
          'command': [
              '/usr/local/bin/cull_idle_servers.py',
              'cull_idle_servers.py',
              '--cull-every=60',
              '--timeout=3600',
              '--url=http://127.0.0.1:8081/jhub/hub/api',
          ],
      })
  extraEnv:
    AUTH_KEY: "SSSSSSS"
    REDIRECT_URL: "https://computational-beta.zeblok.com/?SSSSS"

  service:
    type: NodePort
    ports:
      nodePort: 30032
  db:
    type: sqlite-pvc
    pvc:
      accessModes:
        - ReadWriteOnce
      storage: 1G
      storageClassName: csi-cephfs

singleuser:
  imagePullSecret:
    enabled: true
    username: SSSSSS
    email: SSSSSSSS
    password: SSSSSSS
  cpu:
    limit: 1.0
    guarantee: 1.0
  memory:
    limit: 8G
    guarantee: 4G
  extraEnv:
    DEVDEBUG: "yespls"

  storage:
    capacity: 50G
    type: dynamic
    dynamic:
      storageClass: csi-cephfs
  lifecycleHooks:
    postStart:
      exec:
        command: ["gitpuller", "https://github.com/A1996KASH/zeblokNotebooks", "master", "zeblokNotebooks"]

  # Defines the default image
  image:
    name: a1996kash/minimal-notebook-gpu
    tag: latest
  profileList:
    - display_name: "Base Notebook"
      description: "Minimally-functional Jupyter Notebook server"
      default: true
      extra_resource_guarantees:
        nvidia.com/gpu: "1"
      extra_resource_limits:
        nvidia.com/gpu: "1"
    - display_name: "Nvidia Rapids"
      description: "The RAPIDS data science framework includes a collection of libraries for executing end-to-end data science pipelines entirely on the GPU. It is designed to have a familiar look and feel to data scientists working in Python."
      default: False
      kubespawner_override:
        image: a1996kash/nvidia-rapids:latest
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"
        lifecycle_hooks:
          postStart:
            exec:
              command: ["gitpuller", "https://github.com/rapidsai/notebooks", "master", "notebooks"]
    - display_name: "PySpark Notebook"
      description: "jupyter/pyspark-notebook includes Python support for Apache Spark, optionally on Mesos: everything in jupyter/scipy-notebook and its ancestor images, Apache Spark with Hadoop binaries, and Mesos client libraries."
      default: False
      kubespawner_override:
        image: a1996kash/pyspark-notebook-gpu:latest
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"

    - display_name: "All-Spark Notebook"
      description: "jupyter/all-spark-notebook includes Python, R, and Scala support for Apache Spark, optionally on Mesos: everything in jupyter/pyspark-notebook and its ancestor images; IRKernel to support R code in Jupyter notebooks; Apache Toree and spylon-kernel to support Scala code in Jupyter notebooks; ggplot2, sparklyr, and rcurl packages."
      default: False
      kubespawner_override:
        image: a1996kash/all-spark-notebook-gpu:latest
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"

    - display_name: "DataScience Notebook"
      description: "jupyter/datascience-notebook includes libraries for data analysis from the Julia, Python, and R communities: everything in the jupyter/scipy-notebook and jupyter/r-notebook images and their ancestor images; the Julia compiler and base environment; IJulia to support Julia code in Jupyter notebooks; HDF5, Gadfly, and RDatasets packages."
      default: False
      kubespawner_override:
        image: a1996kash/datascience-notebook-gpu:latest
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"

    - display_name: "R notebook"
      description: "jupyter/r-notebook includes popular packages from the R ecosystem: everything in jupyter/minimal-notebook and its ancestor images; the R interpreter and base environment; IRKernel to support R code in Jupyter notebooks; tidyverse packages (including ggplot2, dplyr, tidyr, readr, purrr, tibble, stringr, lubridate, and broom) from conda-forge; plyr, devtools, shiny, rmarkdown, forecast, rsqlite, reshape2, nycflights13, caret, rcurl, and randomforest packages from conda-forge."
      default: False
      kubespawner_override:
        image: a1996kash/r-notebook-gpu:latest
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"

    - display_name: "Tensorflow Notebook"
      description: "jupyter/tensorflow-notebook includes popular Python deep learning libraries: everything in jupyter/scipy-notebook and its ancestor images, plus the tensorflow and keras machine learning libraries."
      default: False
      kubespawner_override:
        image: a1996kash/tensorflow-notebook-gpu
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"
    - display_name: "Scipy Notebook 2GPU"
      description: "jupyter/scipy-notebook includes popular packages from the scientific Python ecosystem: everything in jupyter/minimal-notebook and its ancestor images; pandas, numexpr, matplotlib, scipy, seaborn, scikit-learn, scikit-image, sympy, cython, patsy, statsmodel, cloudpickle, dill, numba, bokeh, sqlalchemy, hdf5, vincent, beautifulsoup, protobuf, and xlrd packages; ipywidgets for interactive visualizations in Python notebooks; Facets for visualizing machine learning datasets."
      default: False
      kubespawner_override:
        image: a1996kash/scipy-notebook-gpu:latest
        extra_resource_guarantees:
          nvidia.com/gpu: "2"
        extra_resource_limits:
          nvidia.com/gpu: "2"
    - display_name: "Scipy Notebook"
      description: "jupyter/scipy-notebook includes popular packages from the scientific Python ecosystem: everything in jupyter/minimal-notebook and its ancestor images; pandas, numexpr, matplotlib, scipy, seaborn, scikit-learn, scikit-image, sympy, cython, patsy, statsmodel, cloudpickle, dill, numba, bokeh, sqlalchemy, hdf5, vincent, beautifulsoup, protobuf, and xlrd packages; ipywidgets for interactive visualizations in Python notebooks; Facets for visualizing machine learning datasets."
      default: False
      kubespawner_override:
        image: a1996kash/scipy-notebook-gpu:latest
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"
    - display_name: "DataScience Notebook - No GPU"
      description: "jupyter/datascience-notebook includes libraries for data analysis from the Julia, Python, and R communities: everything in the jupyter/scipy-notebook and jupyter/r-notebook images and their ancestor images; the Julia compiler and base environment; IJulia to support Julia code in Jupyter notebooks; HDF5, Gadfly, and RDatasets packages."
      default: False
      kubespawner_override:
        image: a1996kash/datascience-no-gpu:latest

    - display_name: "Explainable AI notebook"
      default: False
      kubespawner_override:
        image: XXXX/data_context_map:latest
        cpu_limit: 2.0
        cpu_guarantee: 2.0
        lifecycle_hooks:
          postStart:
            exec:
              command: ["gitpuller", "XXXX"]
        extra_resource_guarantees:
          nvidia.com/gpu: "1"
        extra_resource_limits:
          nvidia.com/gpu: "1"

API:

URL: https://{{HOST}}/users/ankitladhania/server
BODY:
{
    "options": {
        "profile": "Explainable AI notebook"
    }
}

Also tried (without success):

URL: https://{{HOST}}/users/ankitladhania/server
BODY:
{
    "options": {
        "profile": 5
    }
}

Jhub version: 0.8.2

The API spawns the default image every time, irrespective of the profile.

minrk commented 4 years ago

Discussed on gitter, but the issues here are:

  1. in KubeSpawner 0.11 and above, a "profile" field is supported in user_options (matching the profile's display_name). This is the only supported key, and it should be documented with an example API call in the docs.
  2. the user_options dict is the top-level body of the request, not nested under an 'options' field, so the body should be {"profile": "my profile name"}.
  3. z2jh 0.8.2 deploys kubespawner 0.10, which lacks user_options support.
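A corrected request following points 1 and 2 could be sketched as below. This is only an illustration assuming KubeSpawner >= 0.11; the hub URL, token, and `build_spawn_request` helper are placeholders, not part of the deployment above. Note the body is the top-level user_options dict, and the path goes through the hub's base_url ('/jhub' in this config):

```python
import json
from urllib.request import Request

def build_spawn_request(hub_url, username, token, profile):
    # POST /hub/api/users/:name/server starts the user's server.
    # With KubeSpawner >= 0.11, the JSON body IS the user_options dict,
    # and "profile" (the profile's display_name) is the supported key.
    url = f"{hub_url}/hub/api/users/{username}/server"
    body = json.dumps({"profile": profile}).encode()
    return Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"token {token}",  # a hub API token
            "Content-Type": "application/json",
        },
    )

# Placeholder host and token for illustration:
req = build_spawn_request(
    "https://hub.example.com/jhub",
    "ankitladhania",
    "API_TOKEN",
    "Explainable AI notebook",
)
# To actually send it: urllib.request.urlopen(req)
```

The helper only builds the request object, so the URL and body can be inspected before sending; a 201/202 response means the spawn was accepted.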

While we could add an API-spawning example here, I think it would fit more appropriately directly in the kubespawner docs.