**Is your feature request related to a problem? Please describe.**
I would like to run data quality checks such as Maximum() or Minimum() on datetime columns (DateType or TimestampType in Spark).
**Describe the solution you'd like**
Add support for datetime columns in the API (ConstrainableDataTypes, Analyzers, Profilers).
**Describe alternatives you've considered**
I'm currently converting datetime columns to a numeric representation (via unix_timestamp), but it would be great to run the checks on the original columns directly, avoiding conversion bugs and precision issues.
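To illustrate the workaround, here is a minimal Python sketch of the same arithmetic: convert datetimes to unix timestamps (what Spark's unix_timestamp() produces), run numeric min/max on the converted values, then convert back for reporting. The sample timestamps are hypothetical; in practice this conversion happens on a Spark TimestampType column before handing it to the numeric analyzers.

```python
from datetime import datetime, timezone

# Hypothetical event timestamps; in Spark these would sit in a TimestampType column.
events = [
    datetime(2023, 6, 15, tzinfo=timezone.utc),
    datetime(2023, 1, 1, tzinfo=timezone.utc),
    datetime(2023, 12, 31, tzinfo=timezone.utc),
]

# Workaround: convert to unix timestamps (seconds since epoch), then run
# ordinary numeric Minimum()/Maximum()-style checks on the converted values.
unix_ts = [int(dt.timestamp()) for dt in events]
min_ts, max_ts = min(unix_ts), max(unix_ts)

# Convert back for reporting. A timezone or precision mistake in either
# direction is exactly the kind of bug native datetime support would avoid.
earliest = datetime.fromtimestamp(min_ts, tz=timezone.utc)
latest = datetime.fromtimestamp(max_ts, tz=timezone.utc)
print(earliest.isoformat())  # 2023-01-01T00:00:00+00:00
print(latest.isoformat())    # 2023-12-31T00:00:00+00:00
```

Native support would let Maximum()/Minimum() operate on the column as-is, so the round-trip above (and its failure modes) would disappear.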
Thanks for considering my request! 🙂