pingcap / tispark

TiSpark is built for running Apache Spark on top of TiDB/TiKV
Apache License 2.0

[feature] Support syntax "select a from t partition(p1)" #1977

Open LittleFall opened 3 years ago

LittleFall commented 3 years ago

Is your feature request related to a problem? Please describe.
Currently, TiSpark doesn't support the MySQL/TiDB partitioned-table syntax select col_name from table_name partition(partition_name):

spark.sql("select a from t where partition(p0)").show(false)
org.apache.spark.sql.AnalysisException: Undefined function: 'partition'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 22
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15$$anonfun$applyOrElse$51.apply(Analyzer.scala:1395)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15$$anonfun$applyOrElse$51.apply(Analyzer.scala:1395)
  at org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:53)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15.applyOrElse(Analyzer.scala:1394)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15.applyOrElse(Analyzer.scala:1386)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:258)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:258)
...

docs: https://github.com/pingcap/tispark/pull/1976

Describe the solution you'd like
Support this syntax.

Describe alternatives you've considered
We can still use a where condition to filter the partitions:

scala> spark.sql("select a from t where a<100").show(false)
21/03/23 10:24:44 WARN ObjectStore: Failed to get database test, returning NoSuchObjectException
21/03/23 10:24:44 WARN ObjectStore: Failed to get database test, returning NoSuchObjectException
+---+
|a  |
+---+
+---+
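The where-clause workaround relies on the partitioning scheme being known: a named-partition read can be rewritten as an equivalent predicate on the partitioning column, which Spark SQL does accept. A minimal sketch of that rewrite (in Python for illustration; the table definition, partition names, and bounds are hypothetical, assuming t is range-partitioned on column a):

```python
# Hypothetical example: assume `t` was created in TiDB as
#   CREATE TABLE t (a INT) PARTITION BY RANGE (a)
#     (PARTITION p0 VALUES LESS THAN (100),
#      PARTITION p1 VALUES LESS THAN (200));
# Then `select a from t partition(pN)` can be rewritten as a plain
# WHERE predicate over the partitioning column.

# (partition name, exclusive upper bound), in declaration order
BOUNDS = [("p0", 100), ("p1", 200)]

def predicate_for(name: str) -> str:
    """Return the WHERE predicate equivalent to reading one named partition."""
    names = [n for n, _ in BOUNDS]
    idx = names.index(name)  # raises ValueError for an unknown partition
    upper = BOUNDS[idx][1]
    if idx == 0:
        return f"a < {upper}"
    lower = BOUNDS[idx - 1][1]
    return f"a >= {lower} AND a < {upper}"

if __name__ == "__main__":
    # In spark-shell this would become:
    #   spark.sql(s"select a from t where ${predicateFor("p0")}")
    print(predicate_for("p0"))  # a < 100
    print(predicate_for("p1"))  # a >= 100 AND a < 200
```

The resulting predicate (e.g. a < 100 for p0) is exactly the kind of where condition shown above, and TiSpark's partition pruning can then skip the other partitions.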


shiyuhang0 commented 2 years ago

Sorry for the late reply. Sadly, we can't support this because Spark SQL itself doesn't support the syntax.