Azure / azure-sqldb-spark

This project provides a client library that allows Azure SQL DB or SQL Server to act as an input source or output sink for Spark jobs.
MIT License

sqlContext.sqlDBQuery() throws an error when queryCustom contains text literals in single quotes #75

Closed: rigpat closed this issue 4 years ago

rigpat commented 4 years ago

Environment:

Issue Component: SQLContext.sqlDBQuery()

Issue Summary: Even a valid SQL query passed via queryCustom raises an error if it contains a string literal enclosed in single quotes.

Error Messages:

com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near 'Alfred Schmidt'
...
at com.microsoft.azure.sqldb.spark.query.QueryFunctions.sqlDBQuery(QueryFunctions.scala:57)

Sample Code to recreate Error:

import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.query._
val query = """
              |UPDATE Customers
              |SET ContactName = 'Alfred Schmidt', City = 'Frankfurt'
              |WHERE CustomerID = 1;
            """.stripMargin

val config = Config(Map(
  "url"          -> "mysqlserver.database.windows.net",
  "databaseName" -> "MyDatabase",
  "user"         -> "username",
  "password"     -> "*********",
  "connectTimeout" -> "60", //seconds -- 1 min
  "queryTimeout" -> "7200",  //seconds -- 2 hours
  "bulkCopyBatchSize" -> "2500", //rows
  "bulkCopyTableLock" -> "true", //lock table for bulk insert
  "bulkCopyTimeout"   -> "7200", //seconds -- 2 hours

  "queryCustom"  -> query
))

sqlContext.sqlDBQuery(config)
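As a workaround (not part of the original report), the same UPDATE can be executed over plain JDBC, bypassing the connector's query path entirely. The sketch below uses only the standard `java.sql` API; `buildJdbcUrl` and `runUpdate` are hypothetical helpers, and the server, database, and credential values mirror the placeholders from the sample config.

```scala
import java.sql.DriverManager

// Hypothetical helper: builds a SQL Server JDBC URL from the same values
// used in the Config map above (port 1433 and encryption are assumptions).
def buildJdbcUrl(server: String, database: String): String =
  s"jdbc:sqlserver://$server:1433;databaseName=$database;encrypt=true"

// Hypothetical helper: runs a DML statement and returns the affected row count.
def runUpdate(server: String, database: String,
              user: String, password: String, sql: String): Int = {
  val conn = DriverManager.getConnection(buildJdbcUrl(server, database), user, password)
  try {
    val stmt = conn.createStatement()
    try stmt.executeUpdate(sql) // single-quoted literals are passed through unchanged
    finally stmt.close()
  } finally conn.close()
}
```

Because the statement is handed to the driver verbatim, string literals such as `'Alfred Schmidt'` need no extra escaping here.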

Description:

arvindshmicrosoft commented 4 years ago

Unfortunately, this example did not reproduce the reported error (tested with DBR 6.4 and 6.6). Even if it had, there are no plans to address it in this connector, as it is no longer maintained. However, the Apache Spark Connector for SQL Server and Azure SQL is now available, with support for Python and R bindings, an easier-to-use interface for bulk inserting data, and many other improvements. This repo will soon be archived (set to read-only). We encourage you to actively evaluate and use the new connector.
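For readers migrating, a minimal write with the newer connector looks like the sketch below. It assumes a Spark session and an existing DataFrame `df`; the server, table, and credential values are placeholders carried over from the sample config, so this is a configuration sketch rather than a runnable program.

```scala
// Sketch: bulk-append a DataFrame with the Apache Spark Connector for
// SQL Server and Azure SQL (format "com.microsoft.sqlserver.jdbc.spark").
// All option values below are placeholders; adjust to your environment.
df.write
  .format("com.microsoft.sqlserver.jdbc.spark")
  .mode("append")
  .option("url", "jdbc:sqlserver://mysqlserver.database.windows.net:1433;databaseName=MyDatabase")
  .option("dbtable", "Customers")
  .option("user", "username")
  .option("password", "*********")
  .save()
```

Note that the new connector targets DataFrame reads and writes; arbitrary DML such as the UPDATE in this issue is still best run through plain JDBC.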