Teradata / PyTd

A Python Module to make it easy to script powerful interactions with Teradata Database in a DevOps friendly way.

Expensive logging calls during `convertValue` degrade large query performance #85

Open tetraptych opened 6 years ago

tetraptych commented 6 years ago

The method `convertValue` of `DefaultDataTypeConverter` contains the following logging call (L230-L231):

        logger.trace(
            "Converting \"%s\" to (%s, %s).", value, dataType, typeCode)

This results in a call to `logger.isEnabledFor` for every value converted, even when `configureLogging=False` is set in the application configuration.
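
For reference, this is the standard pattern for a custom TRACE level helper and presumably close to what `util.py` does (an illustrative sketch only; the actual implementation may differ). The level check runs on every call, whether or not tracing is enabled:

    import logging

    TRACE = 5  # assumed numeric value for PyTd's custom TRACE level below DEBUG
    logging.addLevelName(TRACE, "TRACE")

    def trace(self, msg, *args, **kwargs):
        # This check executes on every logger.trace() call, even when the
        # TRACE level is disabled, so its cost is paid once per converted value.
        if self.isEnabledFor(TRACE):
            self._log(TRACE, msg, args, **kwargs)

    logging.Logger.trace = trace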

For context: I am fetching ~80,000 rows with 30 columns each, for a total of 2,400,000 type conversions. Even if each individual logging call is very fast, the time becomes significant given the volume of data. cProfile shows that ~36% of the time in `convertValue` was spent in `util.py:41(trace)`.
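
For anyone wanting to reproduce this, a minimal profiling sketch along these lines should surface the hotspot (connection details and the table name are placeholders, not taken from my actual workload):

    import cProfile
    import pstats

    import teradata

    # Placeholder connection settings; configureLogging=False matches the
    # configuration described above.
    udaExec = teradata.UdaExec(appName="ProfileFetch", version="1.0",
                               configureLogging=False)

    def fetch_all():
        with udaExec.connect(method="odbc", system="mysystem",
                             username="user", password="***") as session:
            # Roughly 80,000 rows x 30 columns -> ~2.4M convertValue calls.
            return sum(1 for _ in session.execute("SELECT * FROM wide_table"))

    cProfile.run("fetch_all()", "fetch.prof")
    pstats.Stats("fetch.prof").sort_stats("cumulative").print_stats("convertValue|trace")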

Maybe a fix similar to the one used for issue #37 could work?
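
For example, one possible approach (a sketch only, with an assumed `TRACE_LOG_LEVEL` constant and a heavily simplified converter; I have not checked how #37 was actually fixed) would be to evaluate the level check once and skip the `logger.trace` call entirely in the hot path:

    import logging

    TRACE_LOG_LEVEL = 5  # assumed constant for PyTd's custom TRACE level
    logger = logging.getLogger(__name__)

    class DefaultDataTypeConverter:
        def __init__(self):
            # Cache the level check once instead of re-evaluating it for
            # every converted value.
            self.traceEnabled = logger.isEnabledFor(TRACE_LOG_LEVEL)

        def convertValue(self, dbType, dataType, typeCode, value):
            if self.traceEnabled:
                logger.log(TRACE_LOG_LEVEL,
                           "Converting \"%s\" to (%s, %s).",
                           value, dataType, typeCode)
            # ... existing conversion logic unchanged ...
            return value

The trade-off is that changes to the logging configuration made after the converter is constructed would not be picked up, but for a fixed `configureLogging=False` setup that seems acceptable.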