databricks / cli

Databricks CLI

How to use a Python wheel with DAB and serverless compute? #1797

Closed RaccoonForever closed 1 month ago

RaccoonForever commented 1 month ago

Describe the issue

I can't find a way to deploy my Python wheel on serverless compute. I tried adding it to the dependencies in environments, but with no success.

I looked at the documentation and at this example: https://github.com/databricks/bundle-examples/tree/main/knowledge_base/serverless_job. It seems to cover only packages that are in a package repository.

My package is built by DAB with the following setup.py:

(screenshot of setup.py attached in the original issue)
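
For context, a minimal sketch of how a wheel built by DAB is commonly declared in databricks.yml; the artifact key, build command, and path below are illustrative assumptions, not taken from the screenshot:

    artifacts:
      default:                                # illustrative artifact key
        type: whl                             # build a Python wheel
        build: python3 setup.py bdist_wheel   # assumed build command, runs during bundle deploy
        path: .                               # directory containing setup.py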

Am I missing something?

Thanks for the great work, by the way!

OS and CLI version

Linux and CLI 0.228.1

pietern commented 1 month ago

Did you try specifying the path to the built wheel file in the dependencies section?

To make sure, I just got it working with the following configuration (in the default template):

      tasks:
        - task_key: task
          environment_key: default
          spark_python_task:
            python_file: ../src/main.py

      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              - ../dist/*.whl
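
For reference, the snippet above sits inside a job resource in databricks.yml; a fuller sketch (the job key and file paths are placeholders) could look like:

    resources:
      jobs:
        example_job:                          # placeholder job key
          name: example_job
          tasks:
            - task_key: task
              environment_key: default
              spark_python_task:
                python_file: ../src/main.py
          environments:
            - environment_key: default
              spec:
                client: "1"
                dependencies:
                  - ../dist/*.whl             # wheel produced by the bundle's artifacts build step

Relative paths such as ../dist/*.whl are resolved against the location of the configuration file that declares them, so the glob should point at the directory where the wheel is written.
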
RaccoonForever commented 1 month ago

I'll try this and keep you posted!

It worked with the following:

          tasks:
            - task_key: X
              python_wheel_task:
                package_name: X
                entry_point: X
                named_parameters: {
                  "env": "dev"
                }
              environment_key: Default
          environments:
            - environment_key: Default
              spec:
                client: "1"
                dependencies:
                  - dist/*.whl

Thanks :)!
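
(For reference: with a configuration like this, the wheel is built and uploaded as part of "databricks bundle deploy", and the job can then be started with "databricks bundle run", following the standard bundle workflow.)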