iiasa / ixmp

The ix modeling platform for integrated and cross-cutting scenario analysis
https://docs.messageix.org/ixmp
Apache License 2.0

Cannot commit after add_timeseries() call #442

Closed · Jihoon closed this 2 years ago

Jihoon commented 2 years ago

I get an unhandled Java exception during the reporting process for a certain scenario. The error may be related to a previously resolved issue.


Code sample or context

The error is coming directly from this line: https://github.com/iiasa/message_data/blob/f813f584334417986bc4eb8c2555483ea44b0535/message_data/tools/post_processing/iamc_report_hackathon.py#L392

To replicate, try this:

import ixmp
import message_ix

mp_ENE = ixmp.Platform("ixmp_dev")
sc_debug = message_ix.Scenario(mp_ENE, "MESSAGEix-GLOBIOM 1.1-M-R12-NGFS", "o_1p5C_NPi2020_600_step1", version=13)

def combined_report(scenario, key, **kwargs):
    '''Combine variant-specific reporting with the standard reporting.

    key: M/BM/BMT to turn on/off the corresponding reporting functions.
    '''

    from message_data.reporting.materials.reporting import report
    from message_data.tools.post_processing.iamc_report_hackathon import report as reporting

    # Remove existing timeseries and add material timeseries
    print("Reporting material-specific variables")
    print("Now reporting", scenario.model, "/", scenario.scenario, "/", scenario.version)

    if 'M' in key:
        report(scenario, "False")

    # Add timeseries from buildings module
    if 'B' in key:
        add_building_ts(scenario)  # not defined in this snippet; unused when key='M'

    if "legacy" in kwargs:
        legacy_args.update(**kwargs["legacy"])

        print("legacy_args", legacy_args) 
        return reporting(
            mp=scenario.platform,
            scen=scenario,
            **legacy_args,
        )

legacy_args = dict(ref_sol=False, merge_ts=True, run_config="ngfs_run_config.yaml")
combined_report(sc_debug, key = 'M', legacy = legacy_args)

Expected result

For other, typical scenarios this produces a single .xlsx output file without error.

Problem description

Instead, it stops with the following error message.

    region                                variable   unit       2025  ...       2080       2090       2100       2110
0  R12_AFR                     Resource|Extraction  EJ/yr  13.521929  ...  14.861251  10.521391  12.236355  15.227066
1  R12_AFR                Resource|Extraction|Coal  EJ/yr   2.436015  ...   0.317465   0.159189   0.070001   0.130232
2  R12_AFR                 Resource|Extraction|Gas  EJ/yr   4.019069  ...  13.816138  10.195092  12.162145  15.091134
3  R12_AFR    Resource|Extraction|Gas|Conventional  EJ/yr   4.019069  ...  13.816138  10.195092   5.427615  10.134519
4  R12_AFR  Resource|Extraction|Gas|Unconventional  EJ/yr   0.000000  ...   0.000000   0.000000   6.734530   4.956614
[5 rows x 16 columns]
Finished uploading timeseries
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\envs\local_mix\Scripts\mix-models-script.py", line 33, in <module>
    sys.exit(load_entry_point('message-ix-models', 'console_scripts', 'mix-models')())
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\click\core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\click\core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\click\core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\click\core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\click\core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\click\core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\click\decorators.py", line 38, in new_func
    return f(get_current_context().obj, *args, **kwargs)
  File "\\hdrive\home$\u045\min\mydocuments\message\message_data\message_data\projects\ngfs\__init__.py", line 118, in run
    main(context)
  File "\\hdrive\home$\u045\min\mydocuments\message\message_data\message_data\projects\ngfs\runscript_main.py", line 266, in main
    rs.run(
  File "\\hdrive\home$\u045\min\mydocuments\message\message_data\message_data\projects\ngfs\scenario_runner.py", line 475, in run
    self.solve(copy_demands=True,
  File "\\hdrive\home$\u045\min\mydocuments\message\message_data\message_data\projects\ngfs\scenario_runner.py", line 290, in solve
    combined_report(self.scen, key = variant)
  File "\\hdrive\home$\u045\min\mydocuments\message\message_data\message_data\projects\ngfs\util.py", line 914, in combined_report
    reporting(
  File "\\hdrive\home$\u045\min\mydocuments\message\message_data\message_data\tools\post_processing\iamc_report_hackathon.py", line 393, in report
    scen.commit("Reporting uploaded as timeseries")
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\ixmp\core\timeseries.py", line 199, in commit
    self._backend("commit", comment)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\ixmp\core\timeseries.py", line 108, in _backend
    return self.platform._backend(self, method, *args, **kwargs)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\ixmp\backend\base.py", line 53, in __call__
    return getattr(self, method)(obj, *args, **kwargs)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\ixmp\backend\jdbc.py", line 694, in commit
    _raise_jexception(e)
  File "C:\ProgramData\Anaconda3\envs\local_mix\lib\site-packages\ixmp\backend\jdbc.py", line 138, in _raise_jexception
    raise RuntimeError(msg) from None
RuntimeError: unhandled Java exception: There was a problem writing data to the IXMP database - no changes were saved!
Error writing the timeseries data to the IXMP database!

Versions

Output of `ixmp show-versions`:

```
ixmp: 3.4.0
message_ix: 3.4.1.dev23+g5526f43
  2045b07 (HEAD -> main, origin/main, origin/HEAD) Merge pull request #327 from behnam-zakeri/cleanup-macro
message_ix_models: 2022.3.4.dev19+g6929fd6
  2c97ab5 (HEAD -> main, origin/main, origin/HEAD) Merge pull request #60 from iiasa/feature/workflow
message_data: 2020.6.21.dev1594+g1e32fcc1b.d20220316
  5862a1532 (HEAD -> project/NGFS_phase3) Add material-specific tables
click: 8.0.4
dask: 2022.02.1
genno: installed
graphviz: 0.19.1
jpype: 1.3.0
…
JVM path: C:\Program Files\Java\jdk1.8.0_202\jre\bin\server\jvm.dll
openpyxl: 3.0.9
pandas: 1.4.1
pint: 0.17
xarray: 2022.3.0
yaml: 6.0
iam_units: installed
jupyter: installed
matplotlib: 3.5.1
plotnine: 0.8.0
pyam: 1.3.1
GAMS: 26.1.0
python: 3.9.10 | packaged by conda-forge | (main, Feb 1 2022, 21:21:54) [MSC v.1929 64 bit (AMD64)]
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 94 Stepping 3, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: ('English_United States', '1252')
```
Jihoon commented 2 years ago

I will try to extract the data again, including the material variables.

Jihoon commented 2 years ago

Here is the data frame that I exported at the point where the upload fails: timeseries.csv

Jihoon commented 2 years ago

@OFR-IIASA discovered that there are inf values in the data. These need to be debugged starting from the scenario design.

At the same time, @khaeru suggests "If ixmp should never store ± inf, we can put a check in the Python code that wraps the Java code and raise a more informative error message."

khaeru commented 2 years ago

As a mitigation, user code can currently do something like `assert not df.applymap(np.isinf).any().any()` to ensure there are no infinite values in the data eventually passed to `.add_timeseries()`.
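Spelled out as a small standalone check (hypothetical helper name; assumes the data is in the region/variable/unit/year layout shown in the output above):

```python
import numpy as np
import pandas as pd

def assert_no_inf(df: pd.DataFrame) -> None:
    """Raise AssertionError if any numeric cell in df is +inf or -inf."""
    numeric = df.select_dtypes(include="number")
    inf_mask = numeric.apply(np.isinf)
    if inf_mask.any().any():
        # Show the offending rows so the scenario design can be debugged
        raise AssertionError(
            f"Infinite values in timeseries data:\n{df.loc[inf_mask.any(axis=1)]}"
        )

# Clean data passes silently
ok = pd.DataFrame({"region": ["R12_AFR"], "unit": ["EJ/yr"], "2025": [13.52]})
assert_no_inf(ok)
```

Running this just before `.add_timeseries()` turns the opaque commit failure into an actionable error that names the bad rows.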

Per the quoted suggestion ("If ixmp should never…"):

…to be clear, the "if" is there because there was never a specific stated requirement for whether ixmp should store all valid IEEE 754 floating-point values, including NaN and ±inf, of the kind that NumPy and pandas happily handle. If it should, then JDBCBackend has a particular limitation that other backends should not imitate. If it should not, then JDBCBackend is functioning properly, and the issue is that the error message is uninformative.
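In the latter case, the suggested check in the Python wrapper might look like the following sketch (hypothetical function, not the actual ixmp code path), raising an informative ValueError instead of surfacing the Java exception:

```python
import numpy as np
import pandas as pd

def check_storable(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical pre-commit guard for a backend that cannot store +/-inf.

    Raises ValueError listing the offending rows rather than letting the
    backend fail later with an opaque 'unhandled Java exception'.
    """
    numeric = df.select_dtypes(include="number")
    bad_rows = numeric.apply(np.isinf).any(axis=1)
    if bad_rows.any():
        raise ValueError(
            "This backend cannot store +/-inf; offending rows:\n"
            f"{df.loc[bad_rows]}"
        )
    return df
```

A wrapper like this could run inside the backend's commit path, so every caller gets the clearer message for free.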

phackstock commented 2 years ago

From taking a quick look at the oracle documentation (https://docs.oracle.com/cd/B28359_01/server.111/b28318/datatype.htm#CNCPT313), I found this:

The following numbers can be stored in a NUMBER column:

  • Positive numbers in the range 1 x 10^-130 to 9.99...9 x 10^125 with up to 38 significant digits
  • Negative numbers from -1 x 10^-130 to -9.99...99 x 10^125 with up to 38 significant digits
  • Zero
  • Positive and negative infinity (generated only by importing from an Oracle Database, Version 5)
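For illustration, the documented range can be turned into a quick client-side predicate (a sketch; the constants come from the Oracle documentation quoted above, not from ixmp):

```python
import math

# Approximate bounds of Oracle's NUMBER type, per the documentation above
NUMBER_MAX = 9.99e125   # largest magnitude (up to 38 significant digits)
NUMBER_MIN = 1.0e-130   # smallest nonzero magnitude

def fits_oracle_number(x: float) -> bool:
    """True if a float is storable in a NUMBER column (zero or within range)."""
    if not math.isfinite(x):          # rejects nan, +inf, -inf
        return False
    return x == 0.0 or NUMBER_MIN <= abs(x) <= NUMBER_MAX
```

Typical EJ/yr values pass this check, while the ±inf values found in the failing scenario do not.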

@peterkolp can probably shed more light on which version our Oracle Database is.

khaeru commented 2 years ago

@Jihoon confirms that this occurs only on Oracle, which makes it hard to test a fix; see further discussion in the PR.

peterkolp commented 2 years ago

@peterkolp can probably shed more light on which version our Oracle Database is.

The database version (on x8oda) is Oracle 19 (in more detail: "Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0").

phackstock commented 2 years ago

Does that mean that it should accept infinity values?

khaeru commented 2 years ago

I interpret "(generated only by importing from an Oracle Database, Version 5)" to mean that such values will appear only if they were first stored using Version 5 and then imported into the current Version 19. This implies that Version 19 cannot be made to store them directly.

this occurs only on Oracle, which makes it hard to test a fix; see further discussion in the PR.

Mentioned this previously at e.g. https://github.com/iiasa/ixmp/issues/215#issue-527123129. The problem is that we cannot¹ spin up an Oracle server on (a) a developer's local machine or (b) on GitHub Actions to first reproduce issues and then confirm that fixes work.

¹ not theoretically impossible, but beyond current resources.