Closed: kozanitis closed this issue 9 years ago
Is there a way to escape the field names in Spark SQL? My preference would be against changing the `end` fields unless absolutely necessary.
No worries then... I just verified that you can escape column names by wrapping them in backticks, e.g. `end` (same convention as in HiveQL).
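For illustration, the backtick convention mentioned above can be sketched with a small helper. Note that `quote_identifier` is a hypothetical function written for this example, not a Spark API; it just builds an escaped identifier the way Spark SQL / HiveQL expects, doubling any embedded backticks:

```python
# Minimal sketch of Spark SQL / HiveQL identifier escaping.
# `quote_identifier` is a hypothetical helper, not part of Spark itself.

def quote_identifier(name: str) -> str:
    """Wrap a column name in backticks, doubling any embedded backticks."""
    return "`" + name.replace("`", "``") + "`"

# Reserved keywords like "end" then become safe to reference in a query:
query = "SELECT {start}, {end} FROM alignments".format(
    start=quote_identifier("start"),
    end=quote_identifier("end"),
)
# query == "SELECT `start`, `end` FROM alignments"
```

The same escaped string can be passed to `spark.sql(...)` or used wherever a raw column name would otherwise trip the SQL parser.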
Encountered this myself recently, hence we've been using "stop" instead of "end". "stop" also pairs with "start" a little better, IMHO. I really wish SQL reserved keywords had been taken into consideration; it's a real pain to escape them in SQL statements...
I have trouble accessing the "end" fields (e.g. AlignmentRecord.end, variant.end) from Spark SQL because `end` is a reserved keyword there and conflicts with the field names.
I was wondering: is it possible to assign different names to those fields?