databricks / sbt-databricks

An sbt plugin for deploying code to Databricks Cloud
http://go.databricks.com/register-for-dbc

dbcDeploy problem : Stack trace suppressed #33

Closed adiautodesk closed 8 years ago

adiautodesk commented 8 years ago

I was trying to integrate intelliJ IDE with databricks using the link :

https://docs.cloud.databricks.com/docs/latest/databricks_guide/02%20Product%20Overview/12%20IDE%20Plugin.html

While following the tutorial, I am stuck at step 4, where I need to use dbcDeploy. While executing that command, I am getting the following error:

```
> dbcDeploy
[trace] Stack trace suppressed: run 'last *:dbcFetchClusters' for the full output.
[error] (*:dbcFetchClusters) com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
[error]  at [Source: <!DOCTYPE html> ... Databricks - Sign In ... ; line: 1, column: 2]
[error] Total time: 2 s, completed Jun 13, 2016 9:24:10 PM
```

(The elided `[error]` lines are the HTML of the Databricks sign-in page, echoed line by line.)

Can anyone tell me what the issue could be here?

brkyvz commented 8 years ago

Could you please verify that your 'dbcApiUrl' ends with /api/1.2?
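For reference, the setting being asked about would look something like this in `build.sbt` (the organization name below is a placeholder, not a real shard):

```scala
// sbt-databricks expects the REST API base URL: https, version 1.2.
// "myOrganization" is a placeholder for your Databricks shard name.
dbcApiUrl := "https://myOrganization.cloud.databricks.com/api/1.2"
```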

adiautodesk commented 8 years ago

How can I verify that? I am currently using https://specificLink.cloud.databricks.com/api/1.2

Sorry if that's a dumb question; I am new to Databricks. I was looking at the Databricks REST API documentation and learned there are two REST API releases, 1.2 and 2.0. I have tried both links and could not log in. I received the following error: `{"error_code":"MALFORMED_REQUEST","message":"Request path is malformed"}`

Do I need to set up the REST API?

brkyvz commented 8 years ago

Could you please remove #notebook/118275/ from your URL and try again?

brkyvz commented 8 years ago

Simply try `https://<organization>.cloud.databricks.com/api/1.2`

adiautodesk commented 8 years ago

Sorry, that was a typo. I tried without the notebook part as well. Let me try https://organization.cloud.databricks.com/api/1.2.

adiautodesk commented 8 years ago

That site can't be reached.

brkyvz commented 8 years ago

Could you please share the snippet of your build file that sets the plugin keys?

adiautodesk commented 8 years ago

My plugins.sbt file is as follows:

```scala
logLevel := Level.Warn

addSbtPlugin("com.databricks" %% "sbt-databricks" % "0.1.5")
```

build.sbt file is as follows:

```scala
name := "sparkDemo"

version := "1.0"

scalaVersion := "2.11.8"

// Your username to login to Databricks
dbcUsername := "myUserNameForDatabricks" // e.g. "admin"

// Your password (can be set as an environment variable)
dbcPassword := "myPassword" // e.g. "admin" or System.getenv("DBCLOUD_PASSWORD")
// Gotcha: setting environment variables in IDEs may differ. IDEs usually don't
// pick up environment variables from .bash_profile or .bashrc

// The URL to the Databricks REST API.
dbcApiUrl := "http://myOrganization.cloud.databricks.com/api/2.0" // "https://organization.cloud.databricks.com/api/1.2"

// Add any clusters that you would like to deploy your work to. e.g. "My Cluster"
dbcClusters += "myClusterName" // Add "ALL_CLUSTERS" if you want to attach your work to all clusters

// An optional parameter to set the location to upload your libraries to in the
// workspace, e.g. "/home/USER/libraries". This location must be an existing path.
// NOTE: Specifying this parameter is strongly recommended, as many jars will be
// uploaded to your cluster. Putting them in one folder makes it easy to delete
// all the libraries at once.
dbcLibraryPath := "/home/USER/libraries" // Default is "/"

// Whether to restart the clusters every time a new version is uploaded to Databricks.
dbcRestartOnAttach := true // Default true.

//resolvers := "http://central.maven.org/maven2/com/databricks/spark-csv_2.10/0.1/spark-csv_2.10-0.1.pom
```

brkyvz commented 8 years ago

Just to be sure that it's not a typo: Could you please change the dbcApiUrl to be https and point to api/1.2 instead of api/2.0?
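Concretely, the sketch below shows the requested change against the `dbcApiUrl` line posted above (keeping whatever organization name you already use):

```scala
// Before (from the build.sbt above): http and API version 2.0
// dbcApiUrl := "http://myOrganization.cloud.databricks.com/api/2.0"

// After: https and API version 1.2, which is what the plugin expects
dbcApiUrl := "https://myOrganization.cloud.databricks.com/api/1.2"
```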

adiautodesk commented 8 years ago

Yeah, I tried that. No success. I tried the following: `dbcApiUrl := "https://myOrganizationName.cloud.databricks.com/api/1.2"`

bllchmbrs commented 8 years ago

Have you seen this template? https://github.com/anabranch/Apache-Spark-Scala-Project-Template/blob/master/build.sbt

This might help you rule out user error. All you need to do is set the keys correctly.

I wrote it to make it easy for me to create my own Scala Spark projects.

brkyvz commented 8 years ago

And you normally use Databricks through "https://myOrganizationName.cloud.databricks.com" correct? Could you please reach out to support@databricks.com? Do you happen to have SSO enabled in your organization?

adiautodesk commented 8 years ago

I don't actually use "myOrganizationName" in the URL; I have a separate URL. I have tried both combinations, though. Let me check whether SSO is enabled.

bllchmbrs commented 8 years ago

Please try the template that I included; it's a simple example to make sure that it's not something on your end. If that doesn't work, please follow up with support@databricks.com and we can help you that way :)

brkyvz commented 8 years ago

You will have to use the URL that you log in to Databricks with, plus `/api/1.2`.

adiautodesk commented 8 years ago

Burak, I tried that. Bill, do you mean the template in https://docs.cloud.databricks.com/docs/latest/databricks_guide/02%20Product%20Overview/12%20IDE%20Plugin.html? I have verified it and am using the same template.

bllchmbrs commented 8 years ago

@adiautodesk The one here: https://github.com/anabranch/Apache-Spark-Scala-Project-Template

I have used this and can confirm that it works exactly as is. You just need to swap out:

- the environment variables for your username and password
- the URL, to have your organization name instead of the filler in there now

And I'm asking you to please do that without using the IDE; this will confirm there isn't something different set up with your organization.

Then try running it within the IDE.

adiautodesk commented 8 years ago

Bill, let me try that. Thanks!

adiautodesk commented 8 years ago

@brkyvz: we don't have SSO with Databricks. Can that be an issue?

bllchmbrs commented 8 years ago

@adiautodesk No. Not having SSO has no effect on the REST API, so that is not the issue.

bllchmbrs commented 8 years ago

We haven't heard anything from you, so I'm assuming this helped you fix the issue!