The default date time parser simply provides the value to the Date constructor, which yields epoch time (1970-01-01T00:00:00.000Z) instead of null for null values.
A more appropriate dateTimeParser we use in our project is:
// Map null/undefined/empty datetime values to null instead of the epoch.
result.primaryResults[0].dateTimeParser = (val: null | string) =>
    val === null || val === undefined || val === '' ? null : new Date(val);
data = result.primaryResults[0].toJSON().data;
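For completeness, this is roughly how it fits into a full query. This is only a sketch: the cluster URI, database, table, and the myNullDateTimeValue column are placeholders, AAD application-key auth is just one option, and the import shape may vary by package version.

import { Client as KustoClient, KustoConnectionStringBuilder } from "azure-kusto-data";

async function main(): Promise<void> {
    // Placeholder cluster and credentials; any supported auth method works the same way.
    const kcsb = KustoConnectionStringBuilder.withAadApplicationKeyAuthentication(
        "https://mycluster.kusto.windows.net", "<appId>", "<appKey>", "<tenantId>");
    const client = new KustoClient(kcsb);

    const result = await client.execute("MyDatabase", "MyTable | take 10");
    const table = result.primaryResults[0];

    // Override the default parser so null/empty datetime cells stay null
    // instead of becoming new Date(null), i.e. the epoch.
    table.dateTimeParser = (val: null | string) =>
        val === null || val === undefined || val === '' ? null : new Date(val);

    const data = table.toJSON().data as any[];
    console.log(data[0].myNullDateTimeValue); // null rather than 1970-01-01T00:00:00.000Z
}

main().catch(console.error);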
I'm not sure what default value I would like to see, but this frustrated us, and I wanted to open it here to consider whether this is JavaScript-unique behavior and what the other SDKs do in a similar situation.
Expected
When querying over a table that has a datetime column, if that value is null, it is deserialized as null or undefined by default:

(!result.primaryResults[0].toJSON().data[0].myNullDateTimeValue) === true

Actual

(result.primaryResults[0].toJSON().data[0].myNullDateTimeValue).toISOString() === '1970-01-01T00:00:00.000Z'
Thanks, @jeffwilcox.
This would be a breaking API change.
We'll consider adding it, either under a control flag or as part of a breaking change.
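To illustrate the flag option (purely a sketch; nullDateTimeAsNull is an illustrative name, not a committed API):

// Hypothetical sketch only: nullDateTimeAsNull is an illustrative flag name,
// not an existing option in azure-kusto-data.
const makeDefaultDateTimeParser = (nullDateTimeAsNull: boolean) =>
    (val: string | null): Date | null => {
        if (nullDateTimeAsNull && (val === null || val === '')) {
            return null; // opt-in: keep null instead of coercing to the epoch
        }
        return new Date(val as string); // current behavior: new Date(null) yields 1970-01-01T00:00:00.000Z
    };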