GoogleCloudPlatform / cloud-sql-jdbc-socket-factory

A collection of Java libraries for connecting securely to Cloud SQL
Apache License 2.0

Brief summary of what bug or error was observed #2039

Closed · saikrishnab27 closed 4 months ago

saikrishnab27 commented 4 months ago

Bug Description

Unable to connect to a Cloud SQL PostgreSQL instance using IAM-based authentication.

Example code (or command)

// 1. We're testing locally in spark-shell, started with the following dependencies:

//spark-shell --jars gs://pgsql-loadtest/postgresql-42.7.3.jar,gs://pgsql-loadtest/google-auth-library-oauth2-http-1.16.0.jar,gs://pgsql-loadtest/postgres-socket-factory-1.19.0.jar

// 2. This is the code we're testing locally. It works fine with username and password WITHOUT IAM auth, but fails when we use IAM-based auth.

import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder.appName("Spark to Cloud SQL").getOrCreate()
import spark.implicits._ // needed for Seq.toDF

val data = Seq(("John", "Doe", "1995-01-05", "john.doe@postgresqltutorial.com"))
val df = data.toDF("first_name", "last_name", "dob", "email")

val url = "jdbc:postgresql://1x.xxx.xx0.x:5432/test"
val driver = "org.postgresql.Driver"

val serviceAccountPath = "/home/xxxxx/tensile-xxxx-xxxxxx-750xxxx74.json"
val serviceAccountText = scala.io.Source.fromFile(serviceAccountPath).mkString

val username = "pgsql-test@tensile-xxxxx-xx.iam"
val password = "Random_text_to_bypass"

//val password ="ACURTBKSLP"

val writeOptions = Map( 
  "url" -> url,
  "driver" -> driver,
  "user" -> username,
  "password" -> password,
  "dbtable" -> "persons_delimit"
)

df.write.format("jdbc").options(writeOptions).mode("overwrite").save()
spark.stop()

// 3. Do we have to use the service account key anywhere? Per the documentation it's not required. Should we set the same service account key as GOOGLE_APPLICATION_CREDENTIALS in the local environment where we run this Scala code?
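For reference, the connector's documentation routes IAM auth through the socket factory properties rather than a direct host/port URL. A minimal sketch of what that configuration might look like here (the `my-project:my-region:my-instance` connection name is hypothetical; the rest follows the documented JDBC properties):

```scala
// Sketch: Cloud SQL socket factory with IAM auth, per the connector README.
// With the socket factory, the JDBC URL carries only the database name; the
// connector resolves the instance from the cloudSqlInstance property.
val iamUrl = "jdbc:postgresql:///test?" +
  "cloudSqlInstance=my-project:my-region:my-instance&" + // hypothetical connection name
  "socketFactory=com.google.cloud.sql.postgres.SocketFactory&" +
  "enableIamAuth=true"

val iamWriteOptions = Map(
  "url" -> iamUrl,
  "driver" -> "org.postgresql.Driver",
  // IAM database user: the service account email without ".gserviceaccount.com"
  "user" -> "pgsql-test@tensile-xxxxx-xx.iam",
  // Ignored when enableIamAuth=true, but some drivers require it non-empty
  "password" -> "password",
  "dbtable" -> "persons_delimit"
)
```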

Stacktrace

org.postgresql.util.PSQLException: Something unusual has occurred to cause the driver to fail. Please report this exception.
  at org.postgresql.Driver.connect(Driver.java:321)
  at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:49)
  at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProvider$.create(ConnectionProvider.scala:77)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$createConnectionFactory$1(JdbcUtils.scala:62)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:48)
  at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
  at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
  at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
  at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
  at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:133)
  at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:132)
  at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
  at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
  at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
  at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
  at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:301)
  ... 47 elided
Caused by: java.lang.IllegalArgumentException: Prohibited character
  at org.postgresql.shaded.com.ongres.saslprep.SaslPrep.saslPrep(SaslPrep.java:105)
  at org.postgresql.shaded.com.ongres.scram.common.stringprep.StringPreparations$2.doNormalize(StringPreparations.java:55)
  at org.postgresql.shaded.com.ongres.scram.common.stringprep.StringPreparations.normalize(StringPreparations.java:65)
  at org.postgresql.shaded.com.ongres.scram.common.ScramMechanisms.saltedPassword(ScramMechanisms.java:152)
  at org.postgresql.shaded.com.ongres.scram.common.ScramFunctions.saltedPassword(ScramFunctions.java:59)
  at org.postgresql.shaded.com.ongres.scram.client.ScramSession$ClientFinalProcessor.<init>(ScramSession.java:196)
  at org.postgresql.shaded.com.ongres.scram.client.ScramSession$ClientFinalProcessor.<init>(ScramSession.java:163)
  at org.postgresql.shaded.com.ongres.scram.client.ScramSession$ServerFirstProcessor.clientFinalProcessor(ScramSession.java:130)
  at org.postgresql.jre7.sasl.ScramAuthenticator.processServerFirstMessage(ScramAuthenticator.java:147)
  at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:868)
  at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:207)
  at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:262)
  at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:54)
  at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:273)
  at org.postgresql.Driver.makeConnection(Driver.java:446)
  at org.postgresql.Driver.connect(Driver.java:298)
  ... 72 more

Steps to reproduce?

  1. Created a service account and a Postgres user with the IAM auth option
  2. Granted all privileges to the user in the database ...

Environment

  1. OS type and version:
  2. Java SDK version:
  3. Cloud SQL Java Socket Factory version: 1.19.0

Additional Details

No response

saikrishnab27 commented 4 months ago

@raniksingh

hessjcg commented 4 months ago

This may be a JDBC URL-encoding problem in Spark's JDBC setup code. Different application frameworks handle JDBC property values differently: some escape them before writing the URL, others do not.

Please try manually URL-encoding the user property: replace the @ with %40.
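For example, `java.net.URLEncoder` (already on the classpath) produces that escape; a quick sketch using the user name from the report:

```scala
// URL-encode the IAM user name so the '@' survives Spark's JDBC URL
// handling; URLEncoder turns '@' into '%40' and leaves '.', '-' intact.
import java.net.URLEncoder

val rawUser = "pgsql-test@tensile-xxxxx-xx.iam"
val encodedUser = URLEncoder.encode(rawUser, "UTF-8")
// encodedUser: "pgsql-test%40tensile-xxxxx-xx.iam"
```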

hessjcg commented 4 months ago

Please reopen this if needed.