Closed adiautodesk closed 8 years ago
Could you please verify that your 'dbcApiUrl' ends with /api/1.2?
How can I verify that? I am currently using https://specificLink.cloud.databricks.com/api/1.2
Sorry if that's a dumb question; I am new to Databricks. I was looking at the Databricks REST API documentation and learned that there are two REST API releases, 1.2 and 2.0. I have tried both links and could not log in. I received the following error: {"error_code":"MALFORMED_REQUEST","message":"Request path is malformed"}
Do I need to set up the REST API?
Could you please remove #notebook/118275/ from your URL and try again?
Simply try https://<organization>.cloud.databricks.com/api/1.2
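If it helps, here is a throwaway sanity check you could paste into a Scala REPL before wiring the value into the build; the organization host is a placeholder, and the checks just encode the advice above (https, ends with /api/1.2, no #notebook/... fragment):

```scala
// Sanity-check the shape of a dbcApiUrl value (host is a placeholder).
val dbcApiUrl = "https://myOrganization.cloud.databricks.com/api/1.2"

assert(dbcApiUrl.startsWith("https://"), "use https, not http")
assert(dbcApiUrl.endsWith("/api/1.2"), "URL must end with /api/1.2")
assert(!dbcApiUrl.contains("#"), "strip any #notebook/... fragment")

println("dbcApiUrl looks well-formed")
```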
Sorry, that was a typo. I tried without the notebook part as well. Let me try https://organization.cloud.databricks.com/api/1.2 .
That site can't be reached.
Could you please share the snippet of your build file that sets the plugin keys?
My plugins.sbt file is as follows:
logLevel := Level.Warn

addSbtPlugin("com.databricks" %% "sbt-databricks" % "0.1.5")
build.sbt file is as follows:
name := "sparkDemo"
version := "1.0"
scalaVersion := "2.11.8"
// Your username to log in to Databricks
dbcUsername := "myUserNameForDatabricks" // e.g. "admin"

// Your password (can be set as an environment variable)
dbcPassword := "myPassword" // e.g. "admin" or System.getenv("DBCLOUD_PASSWORD")
// Gotcha: setting environment variables in IDEs may differ. IDEs usually don't
// pick up environment variables from .bash_profile or .bashrc

// The URL to the Databricks REST API.
dbcApiUrl := "http://myOrganization.cloud.databricks.com/api/2.0" // "https://organization.cloud.databricks.com/api/1.2"

// Add any clusters that you would like to deploy your work to, e.g. "My Cluster"
dbcClusters += "myClusterName" // Add "ALL_CLUSTERS" if you want to attach your work to all clusters

// An optional parameter to set the location to upload your libraries to in the workspace, e.g. "/home/USER/libraries"
// This location must be an existing path.
// NOTE: Specifying this parameter is strongly recommended as many jars will be uploaded to your cluster.
// Putting them in one folder will make it easy for you to delete all the libraries at once.
dbcLibraryPath := "/home/USER/libraries" // Default is "/"

// Whether to restart the clusters every time a new version is uploaded to Databricks.
dbcRestartOnAttach := true // Default true.

//resolvers := "http://central.maven.org/maven2/com/databricks/spark-csv_2.10/0.1/spark-csv_2.10-0.1.pom"
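As the comment next to dbcPassword notes, the password can come from an environment variable instead of being hard-coded. A minimal sketch in plain Scala (the variable name DBCLOUD_PASSWORD follows the comment above; the "changeme" fallback is purely illustrative, so the build still loads when the variable is unset):

```scala
// Read the Databricks password from the environment, falling back to a
// placeholder when the variable is not set.
val dbcPasswordValue: String =
  sys.env.getOrElse("DBCLOUD_PASSWORD", "changeme")
```

Note that System.getenv returns null when the variable is missing, which would make the assignment silently broken; sys.env.getOrElse avoids that.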
Just to be sure that it's not a typo: could you please change the dbcApiUrl to use https and point to api/1.2 instead of api/2.0?
Yeah, I tried that. No success. I tried the following: dbcApiUrl := "https://myOrganizationName.cloud.databricks.com/api/1.2"
Have you seen this template? https://github.com/anabranch/Apache-Spark-Scala-Project-Template/blob/master/build.sbt
This might help you make sure there's no user error. All you need to do is correctly set the plugin keys.
I wrote that to make it easy for me to create my own Scala Spark Projects.
And you normally use Databricks through "https://myOrganizationName.cloud.databricks.com" correct?
Could you please reach out to support@databricks.com
? Do you happen to have SSO enabled in your organization?
I don't actually use "myOrganizationName" in the URL; I have a separate URL. I have tried both combinations, though. Let me check whether SSO is enabled.
Please try the template that I included, as it's a simple example to make sure that it's not something on your end. If that doesn't work, please follow up with support@databricks.com and we can help you that way :)
You will have to use the URL that you log in to Databricks with, plus /api/1.2.
Burak, I tried that. Bill, do you mean the template in https://docs.cloud.databricks.com/docs/latest/databricks_guide/02%20Product%20Overview/12%20IDE%20Plugin.html ? I have verified that and am using the same template.
@adiautodesk The one here: https://github.com/anabranch/Apache-Spark-Scala-Project-Template
I have used this and can confirm that it works exactly as is. You just need to swap out:
The environment variables for username and password, and the URL, which should have your organization name instead of the filler that's in there now.
And I'm asking you to please run that without using the IDE; this will confirm there isn't something different set up with your organization.
Then try running it within the IDE.
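For reference, the two swaps described above might look like this in build.sbt; every value here is a placeholder (the environment-variable names DBC_USERNAME and DBC_PASSWORD and the fallbacks are illustrative, not mandated by the plugin):

```scala
// build.sbt fragment — substitute your own organization and credentials.
dbcUsername := sys.env.getOrElse("DBC_USERNAME", "user@example.com")
dbcPassword := sys.env.getOrElse("DBC_PASSWORD", "changeme")
dbcApiUrl   := "https://myOrganizationName.cloud.databricks.com/api/1.2"
```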
Bill, Let me try that. Thanks!!
@brkyvz: we don't have SSO with Databricks. Can that be an issue?
@adiautodesk No, SSO has no effect on the REST API, so not having it is not the issue.
We haven't heard anything from you so I'm assuming this was able to help you fix the issue!
I was trying to integrate the IntelliJ IDE with Databricks using this link:
https://docs.cloud.databricks.com/docs/latest/databricks_guide/02%20Product%20Overview/12%20IDE%20Plugin.html
While following the tutorial, I am stuck at step 4, where I need to use dbcDeploy. While executing that command, I am getting the following error:
Can anyone tell me what the issue could be here?