microsoft / azdo-databricks

A set of Build and Release tasks for Building, Deploying and Testing Databricks notebooks
MIT License

Error: AttributeError: type object 'Retry' has no attribute 'DEFAULT_METHOD_WHITELIST' #59

Open · yashwantpawar opened this issue 1 year ago

yashwantpawar commented 1 year ago

We have the following tasks, which are failing with the error below:

          #===== Configure Databricks CLI =====
          - task: configuredatabricks@0
            displayName: Configure Databricks CLI
            inputs:
              url: $(databricks_url)
              token: $(sm-dbw-access-token)
          #===== Deploy Notebooks to Workspace =====
          - task: deploynotebooks@0
            displayName: Deploy Notebooks to Workspace
            inputs:
              notebooksFolderPath: $(Pipeline.Workspace)/drop/notebook
              workspaceFolder: /NOTEBOOKS/etl1

Error log:

Starting: Deploy Notebooks to Workspace

Task         : Deploy Databricks Notebooks
Description  : Recursively deploys Notebooks from given folder to a Databricks Workspace
Version      : 0.5.6
Author       : Microsoft DevLabs
Help         :

/opt/hostedtoolcache/Python/3.10.6/x64/bin/python -V
Python 3.10.6
Version: 3.10.6

Python3 selected. Running...
/opt/hostedtoolcache/Python/3.10.6/x64/bin/databricks workspace import_dir -o --profile AZDO /home/vsts/work/1/drop/notebook /NOTEBOOKS/etl1
Error: AttributeError: type object 'Retry' has no attribute 'DEFAULT_METHOD_WHITELIST'

[error]The Notebooks import process failed.

Finishing: Deploy Notebooks to Workspace

We used the following workaround to mitigate the issue: https://github.com/databricks/databricks-cli/issues/634

We also think the issue could be caused by https://github.com/urllib3/urllib3/pull/2086/commits/ba593472e6b7230038f73560dde5321e72471246

Can you please confirm whether this is the cause and provide a solution that does not rely on the workaround, so that we have a permanent fix for the issue?
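
For context on that urllib3 change: urllib3 1.26 renamed Retry.DEFAULT_METHOD_WHITELIST to Retry.DEFAULT_ALLOWED_METHODS, and urllib3 2.0 removed the old name, while the CLI code the task invokes still references it (hence the AttributeError above). A diagnostic step along these lines (a sketch, not part of this extension; the display name is illustrative) shows which urllib3 the agent resolved and whether the old attribute is still present:

          #===== Diagnostic: which urllib3 did the agent resolve? =====
          - task: CmdLine@2
            displayName: Check urllib3 Retry compatibility
            inputs:
              script: |
                python -c "import urllib3; print('urllib3', urllib3.__version__)"
                # True on urllib3 < 2.0; False on 2.x, where DEFAULT_METHOD_WHITELIST was removed
                python -c "from urllib3.util.retry import Retry; print(hasattr(Retry, 'DEFAULT_METHOD_WHITELIST'))"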

ashishp-blueshift commented 1 year ago

I have the exact same issue.

I cross-checked a current build with one from March.

The build that failed last night showed this:

...
Collecting requests>=2.17.3 (from databricks-cli)
  Downloading requests-2.30.0-py3-none-any.whl (62 kB)
...
Collecting urllib3<3,>=1.21.1 (from requests>=2.17.3->databricks-cli)
  Downloading urllib3-2.0.2-py3-none-any.whl (123 kB)
...

Whereas the one from March showed:

...
Collecting requests>=2.17.3
  Downloading requests-2.28.2-py3-none-any.whl
...
Collecting urllib3<1.27,>=1.21.1
  Downloading urllib3-1.26.15-py2.py3-none-any.whl

I believe we want this combination, which is what my original environment.yml states and what is referenced in the links below: pip install requests==2.25.1 urllib3==1.26.5
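
In conda environment.yml form, those pins would look roughly like this (a sketch; everything apart from the two pins is illustrative):

    name: databricks-build
    dependencies:
      - python=3.10
      - pip
      - pip:
          - requests==2.25.1
          - urllib3==1.26.5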

Additional links:

https://community.databricks.com/s/topic/0TO8Y000000mOi8WAE/type-object
https://stackoverflow.com/questions/76183443/azure-devops-release-pipeline-attributeerror-type-object-retry-has-no-attribu
https://github.com/marcospereirampj/python-keycloak/issues/196

[UPDATE] I updated my YAML with the steps below and can confirm it worked. Note that for me urllib3==1.25.11 doesn't cause the error, but I have read that 1.26.5 will work too.

    - task: CmdLine@2
      displayName: 'Databricks dependencies'
      inputs:
        script: |
          pip install requests==2.25.1 urllib3==1.25.11

    - task: riserrad.azdo-databricks.azdo-databricks-configuredatabricks.configuredatabricks@0
      displayName: 'Configure Databricks CLI'
      inputs:
        url: $(dbwks-etl-url)
        token: $(dbwks-etl-access-token)
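
Optionally, a verification step like the sketch below (names illustrative, not part of the extension) can go after Configure Databricks CLI to confirm the CLI install did not pull requests/urllib3 back up to incompatible versions:

    - task: CmdLine@2
      displayName: 'Verify pinned requests/urllib3'
      inputs:
        script: |
          # Expect requests 2.25.1 and urllib3 1.25.11 (or 1.26.x) per the pins above
          python -c "import requests, urllib3; print(requests.__version__, urllib3.__version__)"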