DTOcean / dtocean

Design and techno-economic assessment of marine renewable energy arrays
https://dtocean.github.io
GNU General Public License v3.0

MemoryError: Tidal time series from database not being populated #29

Closed NajiyaN closed 5 years ago

NajiyaN commented 5 years ago

Hi,

I am working with DTOcean 2.0 and the available example database. I noticed that the tidal time series data from the database is not being populated in the hydrodynamics module after the initiate dataflow step (the tidal time series item shows as a red square). I tried this with two different databases, both containing tidal time series data. Please advise on what the possible reasons could be.

[screenshot: tidal_error]

Thanks

H0R5E commented 5 years ago

Hi, if you scroll up in the system window, there should be an error logged there that passed silently. Can you post the full stack trace here?

For the tidal time series table, the rule is that there must be an equal number of time steps for every point in the bathymetry table (for your particular site). You can check the lengths of the matching tables in the filter schema of the database (filter.bathymetry and filter.time_series_energy_tidal) using the "Count rows" command, then check whether one is an exact multiple of the other.

One issue might be that the values in the fk_bathymetry_id column do not match the values in the id column of the bathymetry table.
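
If it is easier, both checks can also be run directly against the database from outside the GUI. This is only a sketch, assuming a PostgreSQL connection via psycopg2; the connection details are placeholders, and the schema, table, and column names are the ones mentioned above (adjust them if your database differs):

```python
# Sketch of the two checks above, run directly against the database.
# Connection parameters are placeholders.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="dtocean",
                        user="postgres", password="<password>")
cur = conn.cursor()

# 1. Compare row counts of the filtered bathymetry and tidal time series tables
cur.execute("SELECT count(*) FROM filter.bathymetry")
n_bathymetry = cur.fetchone()[0]
cur.execute("SELECT count(*) FROM filter.time_series_energy_tidal")
n_time_series = cur.fetchone()[0]
print(n_bathymetry, n_time_series, n_time_series % n_bathymetry)  # remainder should be 0

# 2. Find foreign keys in the time series with no matching bathymetry id
cur.execute("""
    SELECT DISTINCT t.fk_bathymetry_id
    FROM filter.time_series_energy_tidal AS t
    LEFT JOIN filter.bathymetry AS b ON b.id = t.fk_bathymetry_id
    WHERE b.id IS NULL
""")
print(cur.fetchall())  # should be an empty list

cur.close()
conn.close()
```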

NajiyaN commented 5 years ago

Hi, thanks for getting back. The full trace is pasted below.

2019-06-19 14:56:24,864 - INFO - dtocean_core.core - Data added for identifier 'device.system_type'
2019-06-19 14:56:41,160 - INFO - dtocean_core.core - Data added for identifier 'hidden.pipeline_active'
2019-06-19 14:56:45,733 - INFO - dtocean_core.core - Data added for identifiers: hidden.landing_points hidden.site_boundaries hidden.available_systems hidden.corridor_boundaries hidden.available_sites hidden.lease_boundaries
2019-06-19 14:56:45,792 - INFO - dtocean_core.core - Data added for identifiers: device.available_names site.available_names hidden.corridor_selected hidden.lease_selected
2019-06-19 14:56:54,171 - INFO - dtocean_core.core - Data added for identifier 'site.selected_name'
2019-06-19 14:56:59,948 - INFO - dtocean_core.core - Data added for identifier 'device.selected_name'
2019-06-19 14:57:03,996 - INFO - dtocean_core.core - Data added for identifiers: site.corridor_boundary site.lease_boundary site.projection hidden.site_boundary corridor.landing_point hidden.corridor_selected hidden.lease_selected
2019-06-19 14:59:02,736 - INFO - dtocean_core.core - Data added for identifiers: hidden.site_filtered hidden.device_filtered
2019-06-19 14:59:02,776 - INFO - dtocean_core.core - Data added for identifier 'hidden.dataflow_active'
2019-06-19 15:02:45,834 - ERROR - root - Reading variables farm.tidal_series generated error: <type 'exceptions.MemoryError'>
Traceback (most recent call last):
  File "C:\Users\Safraj\DTOcean\lib\site-packages\dtocean_core\pipeline.py", line 1245, in _read_variables
    fetch_var_ids)
  File "C:\Users\Safraj\DTOcean\lib\site-packages\dtocean_core\pipeline.py", line 1146, in _get_read_values
    interface = core.connect_interface(project, interface)
  File "C:\Users\Safraj\DTOcean\lib\site-packages\dtocean_app\core.py", line 297, in connect_interface
    interface = super(GUICore, self).connect_interface(project, interface)
  File "C:\Users\Safraj\DTOcean\lib\site-packages\dtocean_core\core.py", line 1503, in connect_interface
    interface.safe_connect()
  File "C:\Users\Safraj\DTOcean\lib\site-packages\aneris\boundary\interface.py", line 748, in safe_connect
    self.connect()
  File "C:\Users\Safraj\DTOcean\lib\site-packages\dtocean_core\interfaces\bathymetry.py", line 463, in connect
    raw_strata = tidal_series_records_to_xset(result)
  File "C:\Users\Safraj\DTOcean\lib\site-packages\dtocean_core\utils\database.py", line 183, in tidal_series_records_to_xset
    'ssh'])
  File "C:\Users\Safraj\DTOcean\lib\site-packages\pandas\core\frame.py", line 1269, in from_records
    coerce_float=coerce_float)
  File "C:\Users\Safraj\DTOcean\lib\site-packages\pandas\core\frame.py", line 7475, in _to_arrays
    dtype=dtype)
  File "C:\Users\Safraj\DTOcean\lib\site-packages\pandas\core\frame.py", line 7549, in _list_to_arrays
    content = list(lib.to_object_array_tuples(data).T)
  File "pandas/_libs/src\inference.pyx", line 1563, in pandas._libs.lib.to_object_array_tuples
MemoryError
2019-06-19 15:02:53,012 - INFO - dtocean_core.core - Data added for identifiers: constants.cylinder_wake_amplificiation farm.max_gamma_100_year farm.mean_wind_speed_100_year device.system_profile farm.max_gust_wind_direction_100_year device.bidirection constants.rectangular_current_drag device.installation_depth_min constants.gravity device.dry_beam_area device.wet_beam_area device.system_height device.foundation_type constants.pile_Bm_moment_coefficient farm.mean_wind_direction_100_year farm.tidal_occurrence_point device.system_displaced_volume constants.soilprops constants.air_density farm.max_tp_100_year device.turbine_diameter device.system_centre_of_gravity device.minimum_distance_x device.minimum_distance_y constants.concrete_density device.cut_out_velocity constants.soil_drained_holding_capacity_factor constants.rectangular_drift constants.rectangular_wave_inertia farm.direction_of_max_surface_current device.system_mass farm.wave_direction_100_year component.foundations_pile device.foundation_location farm.min_water_level_50_year constants.pile_skin_friction_end_bearing_capacity farm.max_gust_wind_speed_100_year device.system_width component.foundations_anchor bathymetry.mannings component.foundations_anchor_soft device.turbine_performance device.power_rating device.dry_frontal_area farm.max_surface_current_10_year constants.pile_Am_moment_coefficient constants.soil_cohesive_reaction_coefficient device.turbine_hub_height component.foundations_anchor_sand farm.nogo_areas constants.grout_density device.cut_in_velocity device.yaw farm.max_hs_100_year constants.pile_deflection_coefficients farm.current_profile farm.blockage_ratio constants.steel_density device.system_roughness device.prescribed_footprint_radius bathymetry.layers constants.sea_water_density constants.cylinder_drag device.wet_frontal_area constants.line_bearing_capacity_factor constants.grout_compressive_strength farm.max_water_level_50_year constants.rectangular_wind_drag constants.soil_cohesionless_reaction_coefficient device.installation_depth_max device.system_length

I think that in the database that came with DTOcean 2, a few grid points have fewer time steps than the others (as is the case for the first bathymetry id), since the number of entries in the tidal time series table is not a multiple of the number of bathymetry points.

The second database showed an error when it was created by reading values into an empty DB template. This was probably related to the tidal time series. I am attaching it herewith; I would appreciate it if you could have a look. [screenshot: error_newDB]

Thanks

H0R5E commented 5 years ago

> I think that in the database that came with DTOcean 2, a few grid points have fewer time steps than the others (as is the case for the first bathymetry id), since the number of entries in the tidal time series table is not a multiple of the number of bathymetry points.

Remember that the example database contains two example sites (one wave and one tidal) but only one tidal time series, which is why you should look at the counts in the filter schema of the database (it holds only the data for your selected site) rather than the project schema. For the tidal example there are 27889 grid points and 4685352 entries in the time_series_energy_tidal table, which makes exactly 168 time steps per grid point (4685352 / 27889 = 168).
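
For reference, the divisibility check is only a couple of lines of Python (the counts are the ones quoted above for the example tidal site; swap in your own):

```python
# Counts taken from the example database discussed above
n_bathymetry_points = 27889   # rows in filter.bathymetry
n_time_series_rows = 4685352  # rows in filter.time_series_energy_tidal

assert n_time_series_rows % n_bathymetry_points == 0
print(n_time_series_rows // n_bathymetry_points)  # -> 168 time steps per point
```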

So, as I said, it is important that you have a fixed number of time steps for every grid point in your bathymetry in the time_series_energy_tidal table. However, your immediate problem seems to be with your grid size. This is the relevant part of the error stack trace in your log:

2019-06-19 15:02:45,834 - ERROR - root - Reading variables farm.tidal_series generated error: <type 'exceptions.MemoryError'>
...
  File "C:\Users\Safraj\DTOcean\lib\site-packages\dtocean_core\utils\database.py", line 183, in tidal_series_records_to_xset
    'ssh'])
  File "C:\Users\Safraj\DTOcean\lib\site-packages\pandas\core\frame.py", line 1269, in from_records
    coerce_float=coerce_float)
...
  File "pandas/_libs/src\inference.pyx", line 1563, in pandas._libs.lib.to_object_array_tuples
MemoryError

(The full traceback is reproduced in your log above.)

The MemoryError indicates you have run out of memory. How many grid points and time steps do you have, and how much RAM do you have? The recommended upper limit for a single site definition in the time_series_energy_tidal table is 5 x 10^6 rows, as stated in the manual.
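
As the traceback shows, the whole time series table is loaded into memory in one go (via pandas DataFrame.from_records), so the total row count is what matters. A minimal back-of-the-envelope check against the manual's guideline could look like the sketch below; the grid-point and time-step numbers are placeholders for your own site:

```python
# Rough check of a tidal site against the recommended row limit.
RECOMMENDED_MAX_ROWS = 5e6   # limit quoted from the manual above

n_grid_points = 27889        # placeholder: points in your filtered bathymetry
n_time_steps = 168           # placeholder: time steps per grid point

n_rows = n_grid_points * n_time_steps
print("rows:", n_rows, "within limit:", n_rows <= RECOMMENDED_MAX_ROWS)
```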

NajiyaN commented 5 years ago

> Remember that the example database contains two example sites (one wave and one tidal) but only one tidal time series, which is why you should look at the counts in the filter schema of the database.

Thanks for pointing this out; I was looking at the project schema of the database.

> The MemoryError indicates you have run out of memory. How many grid points and time steps do you have, and how much RAM do you have?

My system RAM is only 4 GB.

H0R5E commented 5 years ago

What you can achieve with 4GB RAM is more limited. The recommended minimum is 8GB.

Strategies for dealing with this would be either to select a smaller lease area when filtering the database or to increase the spacing between your grid nodes.
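
To get a feel for how much the node spacing matters, here is a purely illustrative sketch; it assumes a regular grid over a rectangular lease area, and all of the numbers are placeholders:

```python
# Illustrative estimate of tidal time series rows for a regular grid
# over a rectangular lease area. All numbers are placeholders.
def estimate_rows(width_m, height_m, spacing_m, n_time_steps):
    nx = int(width_m / spacing_m) + 1
    ny = int(height_m / spacing_m) + 1
    return nx * ny * n_time_steps

for spacing in (25, 50, 100):
    print(spacing, "m spacing ->", estimate_rows(5000, 3000, spacing, 168), "rows")
```

Doubling the spacing roughly quarters the number of rows in the time series table, so coarsening the grid (or shrinking the lease area) is usually the quickest way to stay under the recommended limit on a machine with limited RAM.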

stale[bot] commented 5 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.