DARMA-tasking / LB-analysis-framework

Analysis framework for exploring, testing, and comparing load balancing strategies

Existence of additional file(s) in data directory causes JSON reader to crash due to incorrect n_ranks calculation #403

Closed by ppebay 1 year ago

ppebay commented 1 year ago

Example with the user-defined-memory-toy-problem.yaml case:

[pppebay@localhost]~/Documents/Git/LB-analysis-framework/src/lbaf/Applications$ python LBAF_app.py -c user-defined-memory-toy-problem.yaml
[LBAF_app] Found configuration file at path /Users/pppebay/Documents/Git/LB-analysis-framework/config/user-defined-memory-toy-problem.yaml
[LBAF_app] Logging level: info
[lbsConfigurationValidator] Skeleton schema is valid
[lbsConfigurationValidator] Reading from data was chosen
[lbsConfigurationValidator] from_data schema is valid
[lbsConfigurationValidator] Checking algorithm schema of: {'name': 'InformAndTransfer', 'phase_id': 0, 'parameters': {'n_iterations': 4, 'n_rounds': 2, 'fanout': 2, 'order_strategy': 'arbitrary', 'transfer_strategy': 'Clustering', 'criterion': 'Tempered', 'max_objects_per_transfer': 32, 'deterministic_transfer': False}}
[lbsConfigurationValidator] Algorithm: {'name': 'InformAndTransfer', 'phase_id': 0, 'parameters': {'n_iterations': 4, 'n_rounds': 2, 'fanout': 2, 'order_strategy': 'arbitrary', 'transfer_strategy': 'Clustering', 'criterion': 'Tempered', 'max_objects_per_transfer': 32, 'deterministic_transfer': False}} schema is valid
[LBAF_app] Data stem: /Users/pppebay/Documents/Git/LB-analysis-framework/data/user-defined-memory-toy-problem/toy_mem
[LBAF_app] Executing LBAF version 0.1.0rc1
[LBAF_app] Executing with Python 3.8.16
[JSON_data_files_validator_loader] Retrieve the JSON data files validator at https://raw.githubusercontent.com/DARMA-tasking/vt/develop/scripts/JSON_data_files_validator.py
[JSON_data_files_validator_loader] Saved JSON data files validator to: /Users/pppebay/Documents/Git/LB-analysis-framework/src/lbaf/imported/JSON_data_files_validator.py
[lbsVTDataReader] Reading /Users/pppebay/Documents/Git/LB-analysis-framework/data/user-defined-memory-toy-problem/toy_mem.0.json
[lbsVTDataReader] Reading /Users/pppebay/Documents/Git/LB-analysis-framework/data/user-defined-memory-toy-problem/toy_mem.1.json
[lbsVTDataReader] Reading /Users/pppebay/Documents/Git/LB-analysis-framework/data/user-defined-memory-toy-problem/toy_mem.2.json
[lbsVTDataReader] Reading /Users/pppebay/Documents/Git/LB-analysis-framework/data/user-defined-memory-toy-problem/toy_mem.3.json
[lbsVTDataReader] Reading /Users/pppebay/Documents/Git/LB-analysis-framework/data/user-defined-memory-toy-problem/toy_mem.4.json
multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/Users/pppebay/Documents/Git/LB-analysis-framework/src/lbaf/IO/lbsVTDataReader.py", line 110, in _load_vt_file
    raise FileNotFoundError(f"File {file_name} not found")
FileNotFoundError: File /Users/pppebay/Documents/Git/LB-analysis-framework/data/user-defined-memory-toy-problem/toy_mem.4.json not found
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "LBAF_app.py", line 510, in <module>
    Application().run()
  File "LBAF_app.py", line 323, in run
    reader = LoadReader(
  File "/Users/pppebay/Documents/Git/LB-analysis-framework/src/lbaf/IO/lbsVTDataReader.py", line 66, in __init__
    for rank, decompressed_dict in results:
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/pool.py", line 868, in next
    raise value
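As the issue title notes, the extra file in the data directory inflates the computed number of ranks, so the reader attempts to load toy_mem.4.json even though only toy_mem.0.json through toy_mem.3.json exist. The sketch below is not the LBAF implementation; it is a minimal illustration, under that assumption, of a fragile count-based n_ranks calculation versus a more robust variant that derives rank indices from the filenames themselves (the function names and the use of the toy_mem stem are hypothetical, for illustration only).

```python
import re
from pathlib import Path


def fragile_n_ranks(data_stem: str) -> int:
    """Fragile: count every file sharing the stem prefix.
    A stray file (e.g. a backup copy next to the per-rank JSONs) inflates the count."""
    directory = Path(data_stem).parent
    prefix = Path(data_stem).name
    return len([p for p in directory.iterdir() if p.name.startswith(prefix)])


def robust_rank_files(data_stem: str) -> dict:
    """Robust: keep only files matching '<stem>.<rank>.json' and key them by the
    rank index parsed from the filename, so unrelated files are ignored."""
    directory = Path(data_stem).parent
    prefix = re.escape(Path(data_stem).name)
    pattern = re.compile(rf"^{prefix}\.(\d+)\.json$")
    rank_files = {}
    for p in directory.iterdir():
        m = pattern.match(p.name)
        if m:
            rank_files[int(m.group(1))] = p
    return rank_files


if __name__ == "__main__":
    # Stem taken from the log above (relative form).
    stem = "data/user-defined-memory-toy-problem/toy_mem"
    files = robust_rank_files(stem)
    print(f"n_ranks = {len(files)}; ranks found: {sorted(files)}")
```

With only toy_mem.0.json through toy_mem.3.json plus one unrelated file present, the robust variant still reports n_ranks = 4, whereas the count-based variant would report 5 and lead to the FileNotFoundError shown in the traceback above.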