[ ] Test premerge with a time zone env variable; it defaults to UTC.
import os

def get_time_zone_for_testing():
    # Read the time zone for testing from the environment, defaulting to UTC.
    tz = os.getenv('TEST_SPARK_SESSION_TIMEZONE')
    if tz is None:
        tz = "UTC"
    else:
        tz = tz.strip()
    return tz
# spark_session.py
# Apply the configured time zone to the Spark session used by the tests.
tz = get_time_zone_for_testing()
_spark.conf.set("spark.sql.session.timeZone", tz)
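The env-variable handling above can be exercised in isolation. A minimal sketch that re-creates the helper (the variable name `TEST_SPARK_SESSION_TIMEZONE` comes from the snippet above; the `Asia/Shanghai` value is just an example):

```python
import os

def get_time_zone_for_testing():
    # Same behavior as the helper above: default to UTC, strip whitespace.
    tz = os.getenv('TEST_SPARK_SESSION_TIMEZONE')
    return "UTC" if tz is None else tz.strip()

# Unset -> defaults to UTC.
os.environ.pop('TEST_SPARK_SESSION_TIMEZONE', None)
print(get_time_zone_for_testing())  # UTC

# Set (with stray whitespace) -> the stripped value is used.
os.environ['TEST_SPARK_SESSION_TIMEZONE'] = ' Asia/Shanghai '
print(get_time_zone_for_testing())  # Asia/Shanghai
```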
# conftest.py
# If a case is marked 'disable_timezone_test' and the current time zone for
# testing is non-UTC, skip it: it is not ready to run under a non-UTC time zone.
if (item.get_closest_marker('disable_timezone_test') and
        get_time_zone_for_testing() != "UTC"):
    pytest.skip('Skip because this case is not ready for non-UTC time zone')
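The skip condition above can be checked without pytest or a Spark cluster. Here is a self-contained sketch where `FakeItem` stands in for pytest's test item (`FakeItem` and `should_skip` are hypothetical names; only `get_closest_marker` mirrors the real pytest API):

```python
class FakeItem:
    """Hypothetical stand-in for a pytest item, to illustrate the skip decision."""
    def __init__(self, markers):
        self._markers = set(markers)

    def get_closest_marker(self, name):
        # pytest returns a Mark object or None; a truthy/None value is enough here.
        return name if name in self._markers else None

def should_skip(item, current_tz):
    # Skip only when the case opts out of non-UTC testing AND the
    # configured test time zone is not UTC.
    return bool(item.get_closest_marker('disable_timezone_test')) and current_tz != "UTC"

print(should_skip(FakeItem({'disable_timezone_test'}), 'Asia/Shanghai'))  # True
print(should_skip(FakeItem({'disable_timezone_test'}), 'UTC'))            # False
print(should_skip(FakeItem(set()), 'Asia/Shanghai'))                      # False
```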
[ ] Update CI to test non-UTC time zones for some Spark versions, e.g. only for Spark 311 and Spark 350
[ ] The cases marked 'disable_timezone_test' are skipped when the time zone is non-UTC. We should confirm they fall back to CPU under a non-UTC time zone, either manually or by writing a corresponding fallback test for each marked case
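The confirmation item above can be enforced mechanically: for every test carrying the 'disable_timezone_test' mark, check that a companion fallback test exists. A minimal sketch, assuming a hypothetical `<name>_fallback` naming convention (the test names and `missing_fallback_tests` helper are illustrative, not from the repo):

```python
# Hypothetical inventory of tests: name -> set of pytest markers.
tests = {
    'test_cast_timestamp': {'disable_timezone_test'},
    'test_cast_timestamp_fallback': set(),   # companion fallback test exists
    'test_parse_date': {'disable_timezone_test'},
    # 'test_parse_date_fallback' is missing on purpose
}

def missing_fallback_tests(tests):
    # For each case that opts out of non-UTC testing, require a
    # '<name>_fallback' companion that asserts CPU fallback under non-UTC.
    return sorted(
        name for name, markers in tests.items()
        if 'disable_timezone_test' in markers
        and f'{name}_fallback' not in tests
    )

print(missing_fallback_tests(tests))  # ['test_parse_date']
```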
Additional context
Finally, we should remove the fallback cases that test non-UTC.
Describe the bug
After https://github.com/NVIDIA/spark-rapids/pull/9482 is merged, we should further finish the items listed above.