uber / queryparser

Parsing and analysis of Vertica, Hive, and Presto SQL.
MIT License

Add Support for Teradata Dialect #53

Open TWood67 opened 5 years ago

TWood67 commented 5 years ago

Hello! I am looking to parse and analyze Teradata queries so that we can better understand what tables and columns are used. After some googling I was happy to find that you all had a similar problem and created a solution! Unfortunately, the Teradata dialect is not supported and I have no experience with Haskell. We've already built a proof of concept using antlr4, but I would prefer to use or build on an existing solution. That being said, I have a few questions.
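The table-level extraction described above can be roughly sketched without any parser at all, for flat queries only. This is a naive, stdlib-only illustration of the idea (the query and names below are made up); it is no substitute for a dialect-aware parser like queryparser or an antlr4 grammar, which are needed to handle subqueries, CTEs, quoted identifiers, comments, and the rest of real SQL:

```python
import re

# Naive sketch: treat any identifier that follows FROM or JOIN as a
# table reference. Breaks on subqueries, CTEs, quoting, and comments;
# shown only to illustrate the kind of lineage information wanted here.
TABLE_REF = re.compile(r"\b(?:FROM|JOIN)\s+([A-Za-z_][\w.]*)", re.IGNORECASE)

def referenced_tables(sql: str) -> set:
    """Return the set of names appearing directly after FROM/JOIN."""
    return set(TABLE_REF.findall(sql))

query = """
SELECT o.id, c.name
FROM orders o
JOIN customers c ON o.customer_id = c.id
"""
print(sorted(referenced_tables(query)))  # ['customers', 'orders']
```

A real parser produces a full AST, so it can also resolve column references through aliases and views, which is exactly where regex approaches fall apart.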

  1. The article that led me here mentioned not many engineers had experience with Haskell and most learned it to work on this project. What was this process like? Was everyone able to pick it up quickly and start developing? I'm hoping to understand what kind of overhead I'll incur by going down this route.
  2. When do you anticipate that #36 will be completed? I'm hoping this will give some answers to my first question.

Thanks!

dlthomas commented 5 years ago

https://github.com/LeapYear/queryparser/commits/leapyear-master has some Teradata support, although I broke the catalog defaulting, so depending on your needs there may still be work to do.

In answer to your questions:

  1. With an experienced Haskeller to coach them, onboarding to the language was pretty straightforward. Both of the previously-non-Haskellers had significant experience with Clojure, and one had some experience with OCaml, so there were fewer new concepts than there might otherwise have been. And there's always more Haskell to learn: they were quickly productive, but in relatively well-defined spaces.

  2. My guess would be "when someone submits a pull request." I haven't been at Uber for a while now, but I believe resourcing of the project has been very narrow and needs-based for some time. I'd be happy to be corrected, of course :)

TWood67 commented 5 years ago

This is great, thanks @dlthomas!