snowflakedb / snowpark-python

Snowflake Snowpark Python API
Apache License 2.0

SNOW-735278: Timestamps larger than '2262-04-11 23:47:16' are not converted into pandas DataFrames #665

Open drozdse1 opened 1 year ago

drozdse1 commented 1 year ago

Please answer these questions before submitting your issue. Thanks!

  1. What version of Python are you using?

Python 3.8.13 (default, Oct 19 2022, 22:38:03) [MSC v.1916 64 bit (AMD64)]

  2. What operating system and processor architecture are you using?

Windows-10-10.0.19044-SP0

  3. What are the component versions in the environment (`pip freeze`)?
anyio @ file:///C:/ci/anyio_1644463701441/work/dist
argon2-cffi @ file:///opt/conda/conda-bld/argon2-cffi_1645000214183/work
argon2-cffi-bindings @ file:///C:/ci/argon2-cffi-bindings_1644569878360/work
asn1crypto @ file:///C:/ci/asn1crypto_1652344143274/work
asttokens @ file:///opt/conda/conda-bld/asttokens_1646925590279/work
attrs @ file:///C:/b/abs_09s3y775ra/croot/attrs_1668696195628/work
Babel @ file:///tmp/build/80754af9/babel_1620871417480/work
backcall @ file:///home/ktietz/src/ci/backcall_1611930011877/work
beautifulsoup4 @ file:///C:/ci/beautifulsoup4_1650274792587/work
bleach @ file:///opt/conda/conda-bld/bleach_1641577558959/work
Bottleneck @ file:///C:/Windows/Temp/abs_3198ca53-903d-42fd-87b4-03e6d03a8381yfwsuve8/croots/recipe/bottleneck_1657175565403/work
brotlipy==0.7.0
certifi @ file:///C:/b/abs_ac29jvt43w/croot/certifi_1665076682579/work/certifi
cffi @ file:///C:/Windows/Temp/abs_6808y9x40v/croots/recipe/cffi_1659598653989/work
charset-normalizer @ file:///tmp/build/80754af9/charset-normalizer_1630003229654/work
cloudpickle @ file:///tmp/build/80754af9/cloudpickle_1632508026186/work
colorama @ file:///C:/Windows/TEMP/abs_9439aeb1-0254-449a-96f7-33ab5eb17fc8apleb4yn/croots/recipe/colorama_1657009099097/work
cryptography @ file:///C:/ci/cryptography_1639472366776/work
debugpy @ file:///C:/ci/debugpy_1637073815078/work
decorator @ file:///opt/conda/conda-bld/decorator_1643638310831/work
defusedxml @ file:///tmp/build/80754af9/defusedxml_1615228127516/work
entrypoints @ file:///C:/ci/entrypoints_1649926621247/work
executing @ file:///opt/conda/conda-bld/executing_1646925071911/work
fastjsonschema @ file:///C:/Users/BUILDE~1/AppData/Local/Temp/abs_ebruxzvd08/croots/recipe/python-fastjsonschema_1661376484940/work
filelock @ file:///opt/conda/conda-bld/filelock_1647002191454/work
idna @ file:///C:/b/abs_bdhbebrioa/croot/idna_1666125572046/work
importlib-metadata @ file:///C:/ci/importlib-metadata_1648544472910/work
importlib-resources @ file:///tmp/build/80754af9/importlib_resources_1625135880749/work
ipykernel @ file:///C:/b/abs_21ykzkm7y_/croots/recipe/ipykernel_1662361803478/work
ipython @ file:///C:/b/abs_84shimxwmz/croot/ipython_1668088123494/work
ipython-genutils @ file:///tmp/build/80754af9/ipython_genutils_1606773439826/work
ipywidgets @ file:///tmp/build/80754af9/ipywidgets_1634143127070/work
jedi @ file:///C:/ci/jedi_1644315425835/work
Jinja2 @ file:///C:/b/abs_7cdis66kl9/croot/jinja2_1666908141852/work
json5 @ file:///tmp/build/80754af9/json5_1624432770122/work
jsonschema @ file:///C:/b/abs_59eyhnbyej/croots/recipe/jsonschema_1663375476535/work
jupyter @ file:///C:/Windows/TEMP/abs_56xfdi__li/croots/recipe/jupyter_1659349053177/work
jupyter-console @ file:///opt/conda/conda-bld/jupyter_console_1647002188872/work
jupyter-server @ file:///C:/Windows/TEMP/abs_d3c42c59-765d-4f9b-9fa3-ad5b1369485611i_yual/croots/recipe/jupyter_server_1658754493238/work
jupyter_client @ file:///C:/b/abs_cbnezz4zg0/croot/jupyter_client_1669040266195/work
jupyter_core @ file:///C:/b/abs_b8r5lb5eus/croot/jupyter_core_1668084445779/work
jupyterlab @ file:///C:/ci/jupyterlab_1658909327110/work
jupyterlab-pygments @ file:///tmp/build/80754af9/jupyterlab_pygments_1601490720602/work
jupyterlab-widgets @ file:///tmp/build/80754af9/jupyterlab_widgets_1609884341231/work
jupyterlab_server @ file:///C:/ci/jupyterlab_server_1664911423754/work
lxml @ file:///C:/ci/lxml_1657527495424/work
MarkupSafe @ file:///C:/ci/markupsafe_1654489871526/work
matplotlib-inline @ file:///C:/ci/matplotlib-inline_1661934035815/work
mistune==0.8.4
mkl-fft==1.3.1
mkl-random @ file:///C:/ci/mkl_random_1626186184278/work
mkl-service==2.4.0
nbclassic @ file:///C:/b/abs_26e3fkk516/croot/nbclassic_1668174974037/work
nbclient @ file:///C:/ci/nbclient_1650290386732/work
nbconvert @ file:///C:/b/abs_4av3q4okro/croot/nbconvert_1668450658054/work
nbformat @ file:///C:/b/abs_1dw90o2uqb/croots/recipe/nbformat_1663744957967/work
nest-asyncio @ file:///C:/ci/nest-asyncio_1649829929372/work
notebook @ file:///C:/b/abs_ca13hqvuzw/croot/notebook_1668179888546/work
notebook_shim @ file:///C:/b/abs_ebfczttg6x/croot/notebook-shim_1668160590914/work
numexpr @ file:///C:/Windows/Temp/abs_e2036a32-9fe9-47f3-a04c-dbb1c232ba4b334exiur/croots/recipe/numexpr_1656940304835/work
numpy @ file:///C:/b/abs_5ct9ex77k9/croot/numpy_and_numpy_base_1668593740598/work
oscrypto @ file:///tmp/build/80754af9/oscrypto_1633350059025/work
packaging @ file:///tmp/build/80754af9/packaging_1637314298585/work
pandas @ file:///C:/b/abs_cdcgk91igc/croots/recipe/pandas_1663772960432/work
pandocfilters @ file:///opt/conda/conda-bld/pandocfilters_1643405455980/work
parso @ file:///opt/conda/conda-bld/parso_1641458642106/work
pickleshare @ file:///tmp/build/80754af9/pickleshare_1606932040724/work
pkgutil_resolve_name @ file:///C:/Users/BUILDE~1/AppData/Local/Temp/abs_81wm45v3kb/croots/recipe/pkgutil-resolve-name_1661463352381/work
ply==3.11
prometheus-client @ file:///C:/Windows/TEMP/abs_ab9nx8qb08/croots/recipe/prometheus_client_1659455104602/work
prompt-toolkit @ file:///tmp/build/80754af9/prompt-toolkit_1633440160888/work
psutil @ file:///C:/Windows/Temp/abs_b2c2fd7f-9fd5-4756-95ea-8aed74d0039flsd9qufz/croots/recipe/psutil_1656431277748/work
pure-eval @ file:///opt/conda/conda-bld/pure_eval_1646925070566/work
pyarrow==8.0.0
pycparser @ file:///tmp/build/80754af9/pycparser_1636541352034/work
pycryptodomex @ file:///C:/Users/BUILDE~1/AppData/Local/Temp/abs_67dewsf1d3/croots/recipe/pycryptodomex_1661442964943/work
Pygments @ file:///opt/conda/conda-bld/pygments_1644249106324/work
PyJWT @ file:///C:/ci/pyjwt_1657529430378/work
pyOpenSSL @ file:///opt/conda/conda-bld/pyopenssl_1643788558760/work
pyparsing @ file:///C:/Users/BUILDE~1/AppData/Local/Temp/abs_7f_7lba6rl/croots/recipe/pyparsing_1661452540662/work
PyQt5==5.15.7
PyQt5-sip @ file:///C:/Windows/Temp/abs_d7gmd2jg8i/croots/recipe/pyqt-split_1659273064801/work/pyqt_sip
pyrsistent @ file:///C:/ci/pyrsistent_1636111468851/work
PySocks @ file:///C:/ci/pysocks_1605287845585/work
python-dateutil @ file:///tmp/build/80754af9/python-dateutil_1626374649649/work
pytz @ file:///C:/Windows/TEMP/abs_90eacd4e-8eff-491e-b26e-f707eba2cbe1ujvbhqz1/croots/recipe/pytz_1654762631027/work
pywin32==302
pywinpty @ file:///C:/ci_310/pywinpty_1644230983541/work/target/wheels/pywinpty-2.0.2-cp38-none-win_amd64.whl
pyzmq @ file:///C:/ci/pyzmq_1657616005830/work
qtconsole @ file:///C:/ci/qtconsole_1662018992304/work
QtPy @ file:///C:/ci/qtpy_1662015036641/work
requests @ file:///C:/ci/requests_1657717096906/work
Send2Trash @ file:///tmp/build/80754af9/send2trash_1632406701022/work
sip @ file:///C:/Windows/Temp/abs_b8fxd17m2u/croots/recipe/sip_1659012372737/work
six @ file:///tmp/build/80754af9/six_1644875935023/work
sniffio @ file:///C:/ci/sniffio_1614030707456/work
snowflake-connector-python @ file:///C:/b/abs_34hefeqz_s/croots/recipe/snowflake-connector-python_1662316549484/work
snowflake-snowpark-python @ file:///C:/b/abs_c9l6p0q60d/croot/snowflake-snowpark-python_1667400726103/work
soupsieve @ file:///C:/b/abs_fasraqxhlv/croot/soupsieve_1666296394662/work
stack-data @ file:///opt/conda/conda-bld/stack_data_1646927590127/work
terminado @ file:///C:/ci/terminado_1644322757089/work
tinycss2 @ file:///C:/b/abs_52w5vfuaax/croot/tinycss2_1668168823131/work
toml @ file:///tmp/build/80754af9/toml_1616166611790/work
tornado @ file:///C:/ci/tornado_1662476991259/work
traitlets @ file:///tmp/build/80754af9/traitlets_1636710298902/work
typing_extensions @ file:///C:/Windows/TEMP/abs_dd2d0moa85/croots/recipe/typing_extensions_1659638831135/work
urllib3 @ file:///C:/b/abs_a8_3vfznn_/croot/urllib3_1666298943664/work
wcwidth @ file:///Users/ktietz/demo/mc3/conda-bld/wcwidth_1629357192024/work
webencodings==0.5.1
websocket-client @ file:///C:/ci/websocket-client_1614804473297/work
widgetsnbextension @ file:///C:/ci/widgetsnbextension_1645009558218/work
win-inet-pton @ file:///C:/ci/win_inet_pton_1605306167264/work
wincertstore==0.2
zipp @ file:///C:/ci/zipp_1652274073489/work
  4. What did you do?

session.sql("SELECT CAST('2262-04-11 23:48:17' AS TIMESTAMP) AS T").to_pandas()

  5. What did you expect to see?
    T
0   2262-04-11 23:48:17
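For context (editor's note, not part of the original report): pandas stores `datetime64[ns]` values as a signed 64-bit count of nanoseconds since the epoch, which is exactly why the cutoff falls at `2262-04-11 23:47:16.854775807`. A quick check:

```python
import pandas as pd

# The largest instant representable as datetime64[ns]: a signed 64-bit
# nanosecond count since the 1970-01-01 epoch.
print(pd.Timestamp.max)        # 2262-04-11 23:47:16.854775807
print(pd.Timestamp.max.value)  # 9223372036854775807 == 2**63 - 1
```

Any timestamp past that instant, such as the one in the query above, cannot fit in the nanosecond representation.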
  6. Can you set logging to DEBUG and collect the logs?
2023-01-31 16:04:06,730 - MainThread cursor.py:630 - execute() - DEBUG - executing SQL/command
2023-01-31 16:04:06,731 - MainThread cursor.py:715 - execute() - INFO - query: [SELECT CAST('2262-04-11 23:47:17' AS TIMESTAMP) AS T]
2023-01-31 16:04:06,736 - MainThread connection.py:1302 - _next_sequence_counter() - DEBUG - sequence counter: 37
2023-01-31 16:04:06,737 - MainThread cursor.py:460 - _execute_helper() - DEBUG - Request id: 23301621-a3ec-4e1f-bfba-2ebafe59279e
2023-01-31 16:04:06,738 - MainThread cursor.py:462 - _execute_helper() - DEBUG - running query [SELECT CAST('2262-04-11 23:47:17' AS TIMESTAMP) AS T]
2023-01-31 16:04:06,738 - MainThread cursor.py:471 - _execute_helper() - DEBUG - is_file_transfer: False
2023-01-31 16:04:06,739 - MainThread connection.py:972 - cmd_query() - DEBUG - _cmd_query
2023-01-31 16:04:06,739 - MainThread connection.py:995 - cmd_query() - DEBUG - sql=[SELECT CAST('2262-04-11 23:47:17' AS TIMESTAMP) AS T], sequence_id=[37], is_file_transfer=[False]
2023-01-31 16:04:06,740 - MainThread network.py:1147 - _use_requests_session() - DEBUG - Session status for SessionPool 'xxx.east-us-2.azure.snowflakecomputing.com', SessionPool 1/1 active sessions
2023-01-31 16:04:06,741 - MainThread network.py:827 - _request_exec_wrapper() - DEBUG - remaining request timeout: None, retry cnt: 1
2023-01-31 16:04:06,741 - MainThread network.py:808 - add_request_guid() - DEBUG - Request guid: a1b16f5f-29a0-42fc-a1df-08c7cca41cf2
2023-01-31 16:04:06,744 - MainThread network.py:1006 - _request_exec() - DEBUG - socket timeout: 60
2023-01-31 16:04:06,936 - MainThread connectionpool.py:456 - _make_request() - DEBUG - https://xxx.east-us-2.azure.snowflakecomputing.com:443 "POST /queries/v1/query-request?requestId=23301621-a3ec-4e1f-bfba-2ebafe59279e&request_guid=a1b16f5f-29a0-42fc-a1df-08c7cca41cf2 HTTP/1.1" 200 None
2023-01-31 16:04:06,947 - MainThread network.py:1032 - _request_exec() - DEBUG - SUCCESS
2023-01-31 16:04:06,950 - MainThread network.py:1152 - _use_requests_session() - DEBUG - Session status for SessionPool 'xxx.east-us-2.azure.snowflakecomputing.com', SessionPool 0/1 active sessions
2023-01-31 16:04:06,950 - MainThread network.py:715 - _post_request() - DEBUG - ret[code] = None, after post request
2023-01-31 16:04:06,951 - MainThread network.py:739 - _post_request() - DEBUG - Query id: 01aa0508-0b04-225a-0000-a2f90ee8254a
2023-01-31 16:04:06,952 - MainThread cursor.py:737 - execute() - DEBUG - sfqid: 01aa0508-0b04-225a-0000-a2f90ee8254a
2023-01-31 16:04:06,952 - MainThread cursor.py:739 - execute() - INFO - query execution done
2023-01-31 16:04:06,953 - MainThread cursor.py:741 - execute() - DEBUG - SUCCESS
2023-01-31 16:04:06,953 - MainThread cursor.py:744 - execute() - DEBUG - PUT OR GET: None
2023-01-31 16:04:06,954 - MainThread cursor.py:841 - _init_result_and_meta() - DEBUG - Query result format: arrow
2023-01-31 16:04:06,954 - MainThread server_connection.py:321 - run_query() - DEBUG - Execute query [queryID: 01aa0508-0b04-225a-0000-a2f90ee8254a] SELECT CAST('2262-04-11 23:47:17' AS TIMESTAMP) AS T
2023-01-31 16:04:06,955 - MainThread arrow_iterator.cp38-win_amd64.pyd:0 - __cinit__() - DEBUG - Batches read: 1
2023-01-31 16:04:06,955 - MainThread CArrowIterator.cpp:16 - CArrowIterator() - DEBUG - Arrow BatchSize: 1
---------------------------------------------------------------------------
ArrowInvalid                              Traceback (most recent call last)
File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\server_connection.py:371, in ServerConnection._to_data_or_iter(self, results_cursor, to_pandas, to_iter)
    361 try:
    362     data_or_iter = (
    363         map(
    364             functools.partial(
    365                 _fix_pandas_df_integer, results_cursor=results_cursor
    366             ),
    367             results_cursor.fetch_pandas_batches(),
    368         )
    369         if to_iter
    370         else _fix_pandas_df_integer(
--> 371             results_cursor.fetch_pandas_all(), results_cursor
    372         )
    373     )
    374 except NotSupportedError:

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\connector\cursor.py:990, in SnowflakeCursor.fetch_pandas_all(self, **kwargs)
    987 self._log_telemetry_job_data(
    988     TelemetryField.PANDAS_FETCH_ALL, TelemetryData.TRUE
    989 )
--> 990 return self._result_set._fetch_pandas_all(**kwargs)

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\connector\result_set.py:185, in ResultSet._fetch_pandas_all(self, **kwargs)
    184 """Fetches a single Pandas dataframe."""
--> 185 dataframes = list(self._fetch_pandas_batches())
    186 if dataframes:

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\connector\result_set.py:181, in ResultSet._fetch_pandas_batches(self, **kwargs)
    180 self._can_create_arrow_iter()
--> 181 return self._create_iter(iter_unit=IterUnit.TABLE_UNIT, structure="pandas")

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\connector\result_set.py:223, in ResultSet._create_iter(self, **kwargs)
    221 kwargs["connection"] = self._cursor.connection
--> 223 first_batch_iter = self.batches[0].create_iter(**kwargs)
    225 # Iterator[Tuple] Futures that have not been consumed by the user

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\connector\result_batch.py:696, in ArrowResultBatch.create_iter(self, connection, **kwargs)
    695 if structure == "pandas":
--> 696     return self._get_pandas_iter(connection=connection, **kwargs)
    697 else:

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\connector\result_batch.py:678, in ArrowResultBatch._get_pandas_iter(self, connection, **kwargs)
    677 iterator_data = []
--> 678 dataframe = self.to_pandas(connection=connection, **kwargs)
    679 if not dataframe.empty:

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\connector\result_batch.py:671, in ArrowResultBatch.to_pandas(self, connection, **kwargs)
    670 table = self.to_arrow(connection=connection)
--> 671 return table.to_pandas(**kwargs)

File ~\Anaconda3\envs\py38_env\lib\site-packages\pyarrow\array.pxi:822, in pyarrow.lib._PandasConvertible.to_pandas()

File ~\Anaconda3\envs\py38_env\lib\site-packages\pyarrow\table.pxi:3889, in pyarrow.lib.Table._to_pandas()

File ~\Anaconda3\envs\py38_env\lib\site-packages\pyarrow\pandas_compat.py:803, in table_to_blockmanager(options, table, categories, ignore_metadata, types_mapper)
    802 columns = _deserialize_column_index(table, all_columns, column_indexes)
--> 803 blocks = _table_to_blocks(options, table, categories, ext_columns_dtypes)
    805 axes = [columns, index]

File ~\Anaconda3\envs\py38_env\lib\site-packages\pyarrow\pandas_compat.py:1153, in _table_to_blocks(options, block_table, categories, extension_columns)
   1152 columns = block_table.column_names
-> 1153 result = pa.lib.table_to_blocks(options, block_table, categories,
   1154                                 list(extension_columns.keys()))
   1155 return [_reconstruct_block(item, columns, extension_columns)
   1156         for item in result]

File ~\Anaconda3\envs\py38_env\lib\site-packages\pyarrow\table.pxi:2602, in pyarrow.lib.table_to_blocks()

File ~\Anaconda3\envs\py38_env\lib\site-packages\pyarrow\error.pxi:100, in pyarrow.lib.check_status()

ArrowInvalid: Casting from timestamp[us] to timestamp[ns] would result in out of bounds timestamp: 9223372037000000

During handling of the above exception, another exception occurred:

SnowparkFetchDataException                Traceback (most recent call last)
Cell In [42], line 1
----> 1 session.sql("SELECT CAST('2262-04-11 23:47:17' AS TIMESTAMP) AS T").to_pandas()

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\telemetry.py:138, in df_collect_api_telemetry.<locals>.wrap(*args, **kwargs)
    135 @functools.wraps(func)
    136 def wrap(*args, **kwargs):
    137     with args[0]._session.query_history() as query_history:
--> 138         result = func(*args, **kwargs)
    139     plan = args[0]._select_statement or args[0]._plan
    140     api_calls = [
    141         *plan.api_calls,
    142         {TelemetryField.NAME.value: f"DataFrame.{func.__name__}"},
    143     ]

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\dataframe.py:707, in DataFrame.to_pandas(self, statement_params, block, **kwargs)
    681 @df_collect_api_telemetry
    682 def to_pandas(
    683     self,
   (...)
    687     **kwargs: Dict[str, Any],
    688 ) -> Union["pandas.DataFrame", AsyncJob]:
    689     """
    690     Executes the query representing this DataFrame and returns the result as a
    691     `Pandas DataFrame <https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.html>`_.
   (...)
    705         :func:`Session.sql` can only be a SELECT statement.
    706     """
--> 707     result = self._session._conn.execute(
    708         self._plan,
    709         to_pandas=True,
    710         block=block,
    711         data_type=_AsyncResultType.PANDAS,
    712         _statement_params=create_or_update_statement_params_with_query_tag(
    713             statement_params, self._session.query_tag, SKIP_LEVELS_TWO
    714         ),
    715         **kwargs,
    716     )
    718     # if the returned result is not a pandas dataframe, raise Exception
    719     # this might happen when calling this method with non-select commands
    720     # e.g., session.sql("create ...").to_pandas()
    721     if block:

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\server_connection.py:406, in ServerConnection.execute(self, plan, to_pandas, to_iter, block, data_type, **kwargs)
    402 if is_in_stored_procedure() and not block:  # pragma: no cover
    403     raise NotImplementedError(
    404         "Async query is not supported in stored procedure yet"
    405     )
--> 406 result_set, result_meta = self.get_result_set(
    407     plan, to_pandas, to_iter, **kwargs, block=block, data_type=data_type
    408 )
    409 if not block:
    410     return result_set

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\analyzer\snowflake_plan.py:87, in SnowflakePlan.Decorator.wrap_exception.<locals>.wrap(*args, **kwargs)
     85 def wrap(*args, **kwargs):
     86     try:
---> 87         return func(*args, **kwargs)
     88     except snowflake.connector.errors.ProgrammingError as e:
     89         tb = sys.exc_info()[2]

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\server_connection.py:494, in ServerConnection.get_result_set(self, plan, to_pandas, to_iter, block, data_type, **kwargs)
    492 for holder, id_ in placeholders.items():
    493     final_query = final_query.replace(holder, id_)
--> 494 result = self.run_query(
    495     final_query,
    496     to_pandas,
    497     to_iter and (i == len(plan.queries) - 1),
    498     is_ddl_on_temp_object=query.is_ddl_on_temp_object,
    499     block=not is_last,
    500     data_type=data_type,
    501     async_job_plan=plan,
    502     **kwargs,
    503 )
    504 placeholders[query.query_id_place_holder] = (
    505     result["sfqid"] if not is_last else result.query_id
    506 )
    507 result_meta = self._cursor.description

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\server_connection.py:104, in ServerConnection._Decorator.wrap_exception.<locals>.wrap(*args, **kwargs)
    100     raise SnowparkClientExceptionMessages.SERVER_SESSION_EXPIRED(
    101         ex.cause
    102     )
    103 except Exception as ex:
--> 104     raise ex

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\server_connection.py:98, in ServerConnection._Decorator.wrap_exception.<locals>.wrap(*args, **kwargs)
     96     raise SnowparkClientExceptionMessages.SERVER_SESSION_HAS_BEEN_CLOSED()
     97 try:
---> 98     return func(*args, **kwargs)
     99 except ReauthenticationRequest as ex:
    100     raise SnowparkClientExceptionMessages.SERVER_SESSION_EXPIRED(
    101         ex.cause
    102     )

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\server_connection.py:341, in ServerConnection.run_query(self, query, to_pandas, to_iter, is_ddl_on_temp_object, block, data_type, async_job_plan, **kwargs)
    335 # fetch_pandas_all/batches() only works for SELECT statements
    336 # We call fetchall() if fetch_pandas_all/batches() fails,
    337 # because when the query plan has multiple queries, it will
    338 # have non-select statements, and it shouldn't fail if the user
    339 # calls to_pandas() to execute the query.
    340 if block:
--> 341     return self._to_data_or_iter(
    342         results_cursor=results_cursor, to_pandas=to_pandas, to_iter=to_iter
    343     )
    344 else:
    345     return AsyncJob(
    346         results_cursor["queryId"],
    347         query,
   (...)
    351         **kwargs,
    352     )

File ~\Anaconda3\envs\py38_env\lib\site-packages\snowflake\snowpark\_internal\server_connection.py:381, in ServerConnection._to_data_or_iter(self, results_cursor, to_pandas, to_iter)
    379         raise
    380     except BaseException as ex:
--> 381         raise SnowparkClientExceptionMessages.SERVER_FAILED_FETCH_PANDAS(
    382             str(ex)
    383         )
    384 else:
    385     data_or_iter = (
    386         iter(results_cursor) if to_iter else results_cursor.fetchall()
    387     )

SnowparkFetchDataException: (1406): Failed to fetch a Pandas Dataframe. The error is: Casting from timestamp[us] to timestamp[ns] would result in out of bounds timestamp: 9223372037000000
sfc-gh-jdu commented 1 year ago

This is caused by https://github.com/snowflakedb/snowflake-connector-python/issues/1378 and we will be working on it in the next few weeks.
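Editor's note: until the connector change lands, one possible workaround (an assumption on my part, not an official fix) is to cast the column to VARCHAR server-side so the client receives strings, then parse them into plain Python `datetime` objects, which have no nanosecond limit. The `session.sql` call needs a live Snowpark `Session`, so it is shown only as a comment; the local parsing step runs on simulated data:

```python
import datetime as dt

import pandas as pd

# With a live Snowpark session, fetch the column as text instead of a timestamp:
#   df = session.sql(
#       "SELECT TO_VARCHAR(CAST('2262-04-11 23:48:17' AS TIMESTAMP_NTZ),"
#       " 'YYYY-MM-DD HH24:MI:SS') AS T"
#   ).to_pandas()
# Simulated result for illustration:
df = pd.DataFrame({"T": ["2262-04-11 23:48:17"]})

# Parse into object-dtype datetimes; out-of-range values survive intact.
df["T"] = df["T"].map(lambda s: dt.datetime.strptime(s, "%Y-%m-%d %H:%M:%S"))
print(df["T"][0])  # 2262-04-11 23:48:17
```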