Closed maximauro closed 4 years ago
@maximemauro, thanks for reporting this. Tagging the right team to take a look.
Here is my Pipfile:
```toml
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[dev-packages]
auto-changelog = "*"
coverage = "*"
flake8 = "*"
pylint = "*"
pytest = "*"
pytest-cov = "*"
python-githooks = "*"
python-semantic-release = "*"
flask = "*"

[packages]
azure-common = "*"
azure-mgmt = "*"
msrest = "*"
requests = "*"
cffi = "*"
envsubst = "*"

[requires]
python_version = "3.7"
```
The latest azure-mgmt version (v4.0.0) includes azure-mgmt-datafactory v0.6.0, which is not the latest. The latest version of azure-mgmt-datafactory is v0.8.0, but the release notes do not mention that the DatabricksSparkPython activity has been implemented...
Upgrading azure-mgmt-datafactory to v0.8.0 solved the problem, but I had to get rid of the azure-mgmt meta-package. Closing the issue ;)
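For anyone hitting the same pin, a sketch of what the `[packages]` section might look like after the fix described above, assuming no other subpackages from the azure-mgmt meta-package are needed:

```toml
# Sketch: azure-mgmt meta-package removed, azure-mgmt-datafactory pinned
# directly to a version that supports DatabricksSparkPython (assumption:
# no other azure-mgmt-* subpackages are required by the project).
[packages]
azure-common = "*"
azure-mgmt-datafactory = "==0.8.0"
msrest = "*"
requests = "*"
cffi = "*"
envsubst = "*"
```

After editing the Pipfile, re-lock and sync (`pipenv lock && pipenv sync`) so the old meta-package pin is dropped from Pipfile.lock.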
Hello,
When I try to create a new pipeline in DataFactory with a DatabricksSparkPython activity, I get the following error:
Here is my pipeline configuration:
```json
{
  "name": "Pipeline Test",
  "type": "Microsoft.DataFactory/factories/pipelines",
  "properties": {
    "activities": [
      {
        "name": "Python Hello Test",
        "description": "test",
        "type": "DatabricksSparkPython",
        "typeProperties": {
          "pythonFile": "dbfs:/jobs/test/hello.py"
        },
        "linkedServiceName": {
          "referenceName": "lsName",
          "type": "LinkedServiceReference"
        }
      }
    ]
  }
}
```
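As a quick sanity check before sending the body to the Data Factory API, the configuration above can be built as a plain Python dict and inspected; a minimal stdlib-only sketch (all names and paths are the values from this issue):

```python
import json

# The pipeline body from the issue, expressed as a dict so its shape can
# be checked before it is PUT to the Data Factory pipelines endpoint.
pipeline = {
    "name": "Pipeline Test",
    "type": "Microsoft.DataFactory/factories/pipelines",
    "properties": {
        "activities": [
            {
                "name": "Python Hello Test",
                "description": "test",
                "type": "DatabricksSparkPython",
                "typeProperties": {"pythonFile": "dbfs:/jobs/test/hello.py"},
                "linkedServiceName": {
                    "referenceName": "lsName",
                    "type": "LinkedServiceReference",
                },
            }
        ]
    },
}

activity = pipeline["properties"]["activities"][0]
# A DatabricksSparkPython activity carries its script path in
# typeProperties.pythonFile and must reference a Databricks linked service.
assert activity["type"] == "DatabricksSparkPython"
assert "pythonFile" in activity["typeProperties"]
assert activity["linkedServiceName"]["type"] == "LinkedServiceReference"

payload = json.dumps(pipeline)  # serialized request body
```

This only validates the shape of the request locally; whether the service accepts the activity still depends on the SDK/API version in use, which is what this issue turned out to hinge on.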
Could someone help me with a solution?
Thank you!