What steps will reproduce the problem?
1. Create a source table with a column named after a Hive reserved keyword (e.g. comment)
2. It will be replicated to a staging file in Hadoop
3. Try to load it with load-reduce-check
What is the expected output?
It should load properly
What do you see instead?
An error about an invalid character in the SQL generated by ddlscan when loading
the final table
What version of the product are you using?
3.0
On what operating system?
Hortonworks
Please provide any additional information below.
Notes:
I could solve this issue by modifying ddl-mysql-hive-metadata.vm to wrap Hive
reserved words in backticks instead of prefixing them with an underscore.
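The fix described above can be sketched roughly as follows. This is a minimal illustration of the quoting idea, not the actual Velocity template logic in ddl-mysql-hive-metadata.vm; the reserved-word set here is a small illustrative subset of Hive's keyword list, and the function and table names are hypothetical.

```python
# Illustrative subset of Hive reserved keywords (not the complete list).
HIVE_RESERVED = {"comment", "date", "timestamp", "user", "order"}

def quote_identifier(name):
    """Wrap a column name in backticks if it clashes with a Hive keyword,
    instead of renaming it with an underscore prefix."""
    if name.lower() in HIVE_RESERVED:
        return "`%s`" % name
    return name

# Hypothetical source columns, one of which is a reserved word.
columns = ["id", "comment", "created"]
ddl_cols = ", ".join("%s STRING" % quote_identifier(c) for c in columns)
ddl = "CREATE TABLE staging_t (%s)" % ddl_cols
# ddl -> "CREATE TABLE staging_t (id STRING, `comment` STRING, created STRING)"
```

Backtick-quoting keeps the original column name intact, so downstream consumers do not need to know about a renamed `_comment` column.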
Is there any way to make the load-reduce-check tool use the .csv file for
renaming columns?
Thanks and regards
Original issue reported on code.google.com by josemiq...@gmail.com on 23 Apr 2015 at 8:05