> Perhaps this would be solved with the idea that there should be a `preprocess=func(*args, **kwargs)` feature for `intake-esm`. This was mentioned in the discussion at the NCAR hub but I'm not seeing a formal issue for it.

See #155.
@bradyrx,
I did some further investigation into this chunking issue, and I couldn't get xarray+dask to chunk the data into uniform chunks:
```python
import intake

col = intake.open_esm_datastore(
    "/glade/collections/cmip/catalog/intake-esm-datastore/catalogs/glade-cmip6.json"
)
cat = col.search(
    experiment_id='historical',
    activity_id='CMIP',
    table_id='Omon',
    variable_id='spco2',
    grid_label='gn',
    source_id='CESM2',
)
dsets = cat.to_dataset_dict(cdf_kwargs={"chunks": {"time": -1}}, aggregate=False)
```
```
--> The keys in the returned dictionary of datasets are constructed as follows:
        'activity_id.institution_id.source_id.experiment_id.member_id.table_id.variable_id.grid_label.version.time_range'
--> There will be 26 group(s)
```

```python
for key, ds in dsets.items():
    print(f"member_id={key.split('.')[4]} --> {ds.dims}")
```
```
member_id=r10i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r10i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r10i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r10i1p1f1 --> Frozen(SortedKeysDict({'time': 180, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r11i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r11i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r11i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r11i1p1f1 --> Frozen(SortedKeysDict({'time': 180, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r1i1p1f1 --> Frozen(SortedKeysDict({'time': 1980, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r2i1p1f1 --> Frozen(SortedKeysDict({'time': 1980, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r3i1p1f1 --> Frozen(SortedKeysDict({'time': 1980, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r4i1p1f1 --> Frozen(SortedKeysDict({'time': 1980, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r5i1p1f1 --> Frozen(SortedKeysDict({'time': 1980, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r6i1p1f1 --> Frozen(SortedKeysDict({'time': 1980, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r7i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r7i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r7i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r7i1p1f1 --> Frozen(SortedKeysDict({'time': 180, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r8i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r8i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r8i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r8i1p1f1 --> Frozen(SortedKeysDict({'time': 180, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r9i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r9i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r9i1p1f1 --> Frozen(SortedKeysDict({'time': 600, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
member_id=r9i1p1f1 --> Frozen(SortedKeysDict({'time': 180, 'nlat': 384, 'nlon': 320, 'd2': 2, 'vertices': 4}))
```
I tried different chunking schemes without success:
```python
for chunks in [-1, 11, 180, 200, 500, 800, 2000]:
    dsets = cat.to_dataset_dict(cdf_kwargs={"chunks": {"time": chunks}})
    _, ds = dsets.popitem()
    print(f"specified_chunks={chunks} --> {ds.spco2.chunks}")
    print("*" * 80)
```
```
--> The keys in the returned dictionary of datasets are constructed as follows:
'activity_id.institution_id.source_id.experiment_id.table_id.grid_label'
--> There will be 1 group(s)
specified_chunks=-1 --> ((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1), (600, 600, 600, 180), (384,), (320,))
********************************************************************************
--> The keys in the returned dictionary of datasets are constructed as follows:
'activity_id.institution_id.source_id.experiment_id.table_id.grid_label'
--> There will be 1 group(s)
specified_chunks=11 --> ((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1), (11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 6, 5, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 6, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4, 7, 4), (384,), (320,))
********************************************************************************
--> The keys in the returned dictionary of datasets are constructed as follows:
'activity_id.institution_id.source_id.experiment_id.table_id.grid_label'
--> There will be 1 group(s)
specified_chunks=180 --> ((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1), (180, 180, 180, 60, 120, 60, 120, 60, 120, 60, 60, 60, 120, 60, 120, 60, 120, 60, 180), (384,), (320,))
********************************************************************************
--> The keys in the returned dictionary of datasets are constructed as follows:
'activity_id.institution_id.source_id.experiment_id.table_id.grid_label'
--> There will be 1 group(s)
specified_chunks=200 --> ((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1), (200, 200, 200, 200, 200, 200, 200, 200, 200, 180), (384,), (320,))
********************************************************************************
--> The keys in the returned dictionary of datasets are constructed as follows:
'activity_id.institution_id.source_id.experiment_id.table_id.grid_label'
--> There will be 1 group(s)
specified_chunks=500 --> ((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1), (500, 100, 400, 100, 100, 300, 200, 100, 180), (384,), (320,))
********************************************************************************
--> The keys in the returned dictionary of datasets are constructed as follows:
'activity_id.institution_id.source_id.experiment_id.table_id.grid_label'
--> There will be 1 group(s)
specified_chunks=800 --> ((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1), (600, 200, 400, 400, 200, 180), (384,), (320,))
********************************************************************************
--> The keys in the returned dictionary of datasets are constructed as follows:
'activity_id.institution_id.source_id.experiment_id.table_id.grid_label'
--> There will be 1 group(s)
specified_chunks=2000 --> ((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1), (600, 600, 600, 180), (384,), (320,))
********************************************************************************
```
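To illustrate what appears to be happening, here is a minimal sketch with plain dask arrays (not intake-esm itself), using the same per-file time lengths as the listing above. The chunking is applied file by file, and combining members then forces dask to unify the differing time chunks, which reproduces the mixed chunk pattern shown above.

```python
import dask.array as da

# Stand-in for a member stored as a single 1980-step file, opened with chunks={"time": 180}.
single_file_member = da.zeros((1980, 384, 320), chunks=(180, 384, 320))

# Stand-in for a member split across files of 600, 600, 600, and 180 steps: the chunking
# is applied to each file separately, so every 600-step file becomes (180, 180, 180, 60).
pieces = [da.zeros((n, 384, 320), chunks=(180, 384, 320)) for n in (600, 600, 600, 180)]
split_member = da.concatenate(pieces, axis=0)
print(split_member.chunks[0])   # (180, 180, 180, 60, 180, 180, 180, 60, ...)

# Stacking the members along a new axis makes dask unify the time chunks across members,
# which yields the mixed (180, 180, 180, 60, 120, 60, ...) pattern reported above.
stacked = da.stack([single_file_member, split_member], axis=0)
print(stacked.chunks[1])
```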
> In the future, we would like to enable automatic alignment of Dask chunksizes (but not the other way around). We might also require that all arrays in a dataset share the same chunking alignment. Neither of these are currently done.

I am not sure this note explains the behavior.

Cc'ing @dcherian, as he might know why xarray doesn't uniformly chunk the data.
> Perhaps this would be solved with the idea that there should be a `preprocess=func(*args, **kwargs)` feature

When I tried using the `preprocess`, I was getting this error:

```python
def preprocess(ds):
    return ds.chunk({"time": 180})


dsets = cat.to_dataset_dict(preprocess=preprocess)
_, ds = dsets.popitem()
print(ds.chunks)
```
```
ValueError                                Traceback (most recent call last)
<ipython-input-28-abfadfb85ab7> in <module>
      1 _, ds = dsets.popitem()
----> 2 print(ds.chunks)

/glade/work/abanihi/devel/pangeo/xarray/xarray/core/dataset.py in chunks(self)
   1671                 if dim in chunks and c != chunks[dim]:
   1672                     raise ValueError(
-> 1673                         f"Object has inconsistent chunks along dimension {dim}. "
   1674                         "This can be fixed by calling unify_chunks()."
   1675                     )

ValueError: Object has inconsistent chunks along dimension time. This can be fixed by calling unify_chunks().
```
So far, the only solution to this issue that I am aware of involves re-chunking the data with `ds.chunk()` after loading.
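For reference, a minimal sketch of that workaround, assuming the `dsets` dictionary returned by `to_dataset_dict` above:

```python
# Sketch only: fix up the chunking after aggregation, assuming `dsets` from above.
_, ds = dsets.popitem()

# unify_chunks() makes the chunking consistent across variables, which clears the
# ValueError raised by ds.chunks, but the time chunks are still non-uniform...
ds = ds.unify_chunks()

# ...so rechunk explicitly to get uniform chunk sizes along time.
ds = ds.chunk({"time": 180})
print(ds.spco2.chunks)
```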
Are you reading files that contain different lengths of time?

The `chunks` kwarg is passed down to `open_dataset`, so when you say `"time": -1`, it means that (chunk size for each file) == (length of time in each file). You should rechunk afterward to get what you want.

It would be great if one of you could update the xarray documentation to make that clear (see https://github.com/pydata/xarray/issues/1795).
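The same behavior can be demonstrated with `open_mfdataset` directly, which may help for the docs; a minimal sketch with hypothetical file paths:

```python
import xarray as xr

# chunks= is applied to each file as it is opened, so per-file chunk sizes carry
# over into the combined dataset ("spco2_*.nc" is a hypothetical set of files).
ds = xr.open_mfdataset("spco2_*.nc", combine="by_coords", chunks={"time": -1})
print(ds.spco2.chunks)  # roughly one chunk per input file, e.g. (600, 600, 600, 180)

# Rechunk afterward to get the uniform layout you actually want.
ds = ds.chunk({"time": 1980})
```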
> Are you reading files that contain different lengths of time?
Yes. The dimension sizes vary from file to file; they are the same as in the per-file listing at the top of this thread (three 600-step files plus one 180-step file for most members, and a single 1980-step file for r1i1p1f1 through r6i1p1f1).
> It would be great if one of you could update the xarray documentation to make that clear (see pydata/xarray#1795).

Sounds good. I will take a look in the coming days.
Description

I am using kwargs to chunk my multiple-file datasets as they come in, as in the `intake-esm` demos: `dsets = cat.to_dataset_dict(cdf_kwargs={"chunks": {"time": -1}})`, but after they are concatenated and loaded into a dictionary, the chunking doesn't come through as expected. Note that the chunk size on `spco2` for time is 600 and not 1980. I think this is an issue I've encountered with `open_mfdataset` in the past, so maybe this is not an `intake-esm` issue.

Perhaps this would be solved with the idea that there should be a `preprocess=func(*args, **kwargs)` feature for `intake-esm`. This was mentioned in the discussion at the NCAR hub, but I'm not seeing a formal issue for it.

Output of `intake_esm.__version__`:

'0.0.post698'
Output of `conda list`:
packages in environment at /glade/u/home/rbrady/miniconda3/envs/analysis:
#
Name Version Build Channel
_libgcc_mutex 0.1 main
affine 2.2.2 py_0 conda-forge aiohttp 3.6.1 py36h516909a_0 conda-forge ansiwrap 0.8.4 py_0 conda-forge appdirs 1.4.3 py_1 conda-forge asciitree 0.3.3 py_2 conda-forge asn1crypto 0.24.0 py36_1003 conda-forge async-timeout 3.0.1 py_1000 conda-forge atomicwrites 1.3.0 py_0 conda-forge attrs 19.1.0 py_0 conda-forge backcall 0.1.0 py_0 conda-forge backports 1.0 py_2 conda-forge backports.tempfile 1.0 py_0 conda-forge backports.weakref 1.0.post1 py36_1000 conda-forge basemap 1.2.1 py36hd759880_1 conda-forge black 19.3b0 py_0 conda-forge blas 2.12 openblas conda-forge bleach 3.1.0 py_0 conda-forge blosc 1.17.0 he1b5a44_0 conda-forge bokeh 1.3.4 py36_0 conda-forge boost-cpp 1.68.0 h11c811c_1000 conda-forge boto3 1.9.238 pypi_0 pypi botocore 1.12.238 pypi_0 pypi bottleneck 1.2.1 py36h3010b51_1001 conda-forge bzip2 1.0.8 h516909a_0 conda-forge ca-certificates 2019.9.11 hecc5488_0 conda-forge cached-property 1.5.1 py_0 conda-forge cachetools 3.1.1 pypi_0 pypi cairo 1.16.0 h18b612c_1001 conda-forge cartopy 0.17.0 py36hd759880_1006 conda-forge certifi 2019.9.11 py36_0 conda-forge cf_units 2.0.1 py36h3010b51_1002 conda-forge cffi 1.12.3 py36h8022711_0 conda-forge cfitsio 3.470 hb60a0a2_2 conda-forge cftime 1.0.3.4 py36hd352d35_1001 conda-forge chardet 3.0.4 py36_1003 conda-forge click 7.0 py_0 conda-forge click-plugins 1.1.1 py_0 conda-forge cligj 0.5.0 py_0 conda-forge climpred 1.1.0 py_1 conda-forge cloudpickle 1.2.1 py_0 conda-forge cmocean 2.0 py_1 conda-forge colorcet 2.0.1 py_0 conda-forge colorspacious 1.1.2 pyh24bf2e0_0 conda-forge cryptography 2.7 py36h72c5cf5_0 conda-forge curl 7.65.3 hf8cf82a_0 conda-forge cycler 0.10.0 py_1 conda-forge cython 0.29.13 py36he1b5a44_0 conda-forge cytoolz 0.10.0 py36h516909a_0 conda-forge dask 2.4.0 py_0 conda-forge dask-core 2.4.0 py_0 conda-forge dask-glm 0.2.0 pypi_0 pypi dask-jobqueue 0.6.3 py_0 conda-forge dask-ml 1.0.0 pypi_0 pypi dask-mpi 1.0.3 py36_0 conda-forge datashader 0.7.0 py_0 conda-forge datashape 0.5.4 py_1 conda-forge dbus 1.13.6 he372182_0 conda-forge decorator 4.4.0 py_0 conda-forge defusedxml 0.5.0 py_1 conda-forge distributed 2.4.0 py_0 conda-forge docrep 0.2.7 py_0 conda-forge docutils 0.15.2 py36_0 conda-forge entrypoints 0.3 py36_1000 conda-forge eofs 1.4.0 py_0 conda-forge esmf 7.1.0 h9eb252b_1005 conda-forge esmlab 2019.4.27 py_0 conda-forge esmpy 7.1.0 py36h24bf2e0_3 conda-forge esmtools 1.1.0 pypi_0 pypi expat 2.2.5 he1b5a44_1003 conda-forge fasteners 0.14.1 py_3 conda-forge fastparquet 0.3.2 py36hc1659b7_0 conda-forge fiona 1.8.6 py36hf242f0b_3 conda-forge flake8 3.7.8 py36_1 conda-forge fontconfig 2.13.1 he4413a7_1000 conda-forge freetype 2.10.0 he983fc9_1 conda-forge freexl 1.0.5 h14c3975_1002 conda-forge fribidi 1.0.5 h516909a_1002 conda-forge fsspec 0.4.1 py_0 conda-forge future 0.17.1 py36_1000 conda-forge g2clib 1.6.0 hf3f1b0b_9 conda-forge gcsfs 0.3.1 pypi_0 pypi gdal 2.4.1 py36h5f563d9_10 conda-forge geopandas 0.5.1 py_0 conda-forge geos 3.7.2 he1b5a44_1 conda-forge geotiff 1.4.3 hb6868eb_1001 conda-forge geoviews 1.6.3 py_0 conda-forge geoviews-core 1.6.3 py_0 conda-forge gettext 0.19.8.1 hc5be6a0_1002 conda-forge giflib 5.1.7 h516909a_1 conda-forge git 2.22.0 pl526hce37bd2_0 conda-forge glib 2.58.3 h6f030ca_1002 conda-forge google-auth 1.6.3 pypi_0 pypi google-auth-oauthlib 0.4.1 pypi_0 pypi graphite2 1.3.13 hf484d3e_1000 conda-forge graphviz 2.40.1 h5933667_1 conda-forge gst-plugins-base 1.14.5 h0935bb2_0 conda-forge gstreamer 1.14.5 h36ae1b5_0 conda-forge gsw 3.3.1 py36h516909a_0 conda-forge h5netcdf 0.7.4 py_0 
conda-forge h5py 2.9.0 nompi_py36hcafd542_1103 conda-forge harfbuzz 2.4.0 h37c48d4_1 conda-forge hdf4 4.2.13 h9a582f1_1002 conda-forge hdf5 1.10.4 nompi_h3c11f04_1106 conda-forge hdfeos2 2.20 h64bfcee_1000 conda-forge hdfeos5 5.1.16 hccfc538_4 conda-forge heapdict 1.0.0 py36_1000 conda-forge holoviews 1.12.5 py_0 conda-forge hvplot 0.4.0 py_1 conda-forge icu 58.2 hf484d3e_1000 conda-forge idna 2.8 py36_1000 conda-forge idna_ssl 1.1.0 py36_1000 conda-forge imageio 2.5.0 py36_0 conda-forge importlib_metadata 0.23 py36_0 conda-forge intake 0.5.3 py_0 conda-forge intake-esm 0.0.post698 pypi_0 pypi intake-xarray 0.3.1 py_0 conda-forge ipykernel 5.1.2 py36h5ca1d4c_0 conda-forge ipython 7.8.0 py36h5ca1d4c_0 conda-forge ipython_genutils 0.2.0 py_1 conda-forge ipywidgets 7.5.1 py_0 conda-forge isort 4.3.21 py36_0 conda-forge jasper 1.900.1 h07fcdf6_1006 conda-forge jedi 0.15.1 py36_0 conda-forge jinja2 2.10.1 py_0 conda-forge jmespath 0.9.4 py_0 conda-forge joblib 0.13.2 py_0 conda-forge jpeg 9c h14c3975_1001 conda-forge json-c 0.13.1 h14c3975_1001 conda-forge json5 0.8.5 py_0 conda-forge jsonschema 3.0.2 py36_0 conda-forge jupyter 1.0.0 py_2 conda-forge jupyter-server-proxy 1.1.0 py_0 conda-forge jupyter_client 5.3.1 py_0 conda-forge jupyter_console 6.0.0 py_0 conda-forge jupyter_core 4.4.0 py_0 conda-forge jupyterlab 1.1.4 py_0 conda-forge jupyterlab_server 1.0.5 py_0 conda-forge kealib 1.4.10 h1978553_1003 conda-forge kiwisolver 1.1.0 py36hc9558a2_0 conda-forge krb5 1.16.3 h05b26f9_1001 conda-forge libblas 3.8.0 12_openblas conda-forge libcblas 3.8.0 12_openblas conda-forge libcurl 7.65.3 hda55be3_0 conda-forge libdap4 3.20.2 hd48c02d_1000 conda-forge libedit 3.1.20170329 hf8c457e_1001 conda-forge libffi 3.2.1 he1b5a44_1006 conda-forge libgcc-ng 9.1.0 hdf63c60_0
libgdal 2.4.1 hc4f5fd6_10 conda-forge libgfortran-ng 7.3.0 hdf63c60_0
libiconv 1.15 h516909a_1005 conda-forge libkml 1.3.0 h328b03d_1009 conda-forge liblapack 3.8.0 12_openblas conda-forge liblapacke 3.8.0 12_openblas conda-forge libnetcdf 4.6.2 hbdf4f91_1001 conda-forge libopenblas 0.3.7 h6e990d7_1 conda-forge libpng 1.6.37 hed695b0_0 conda-forge libpq 11.5 hd9ab2ff_1 conda-forge libsodium 1.0.17 h516909a_0 conda-forge libspatialindex 1.9.0 he1b5a44_1 conda-forge libspatialite 4.3.0a h79dc798_1030 conda-forge libssh2 1.8.2 h22169c7_2 conda-forge libstdcxx-ng 9.1.0 hdf63c60_0
libtiff 4.0.10 h57b8799_1003 conda-forge libtool 2.4.6 h14c3975_1002 conda-forge libuuid 2.32.1 h14c3975_1000 conda-forge libxcb 1.13 h14c3975_1002 conda-forge libxml2 2.9.9 h13577e0_2 conda-forge llvmlite 0.29.0 py36hfd453ef_1 conda-forge locket 0.2.0 py_2 conda-forge lxml 4.4.1 pypi_0 pypi lz4 2.2.1 py36hd79334b_0 conda-forge lz4-c 1.8.3 he1b5a44_1001 conda-forge markdown 3.1.1 py_0 conda-forge markupsafe 1.1.1 py36h14c3975_0 conda-forge matplotlib 3.1.1 py36_0 conda-forge matplotlib-base 3.1.1 py36hfd891ef_0 conda-forge mccabe 0.6.1 py_1 conda-forge memory_profiler 0.55.0 py_0 conda-forge metpy 0.10.2 py36_0 conda-forge mistune 0.8.4 py36h14c3975_1000 conda-forge monotonic 1.5 py_0 conda-forge more-itertools 7.2.0 py_0 conda-forge mpi 1.0 mpich conda-forge mpi4py 3.0.2 py36hcf07815_0 conda-forge mpich 3.2.1 hc99cbb1_1014 conda-forge msgpack-numpy 0.4.4.3 py_0 conda-forge msgpack-python 0.6.2 py36hc9558a2_0 conda-forge multidict 4.5.2 py36h14c3975_1000 conda-forge multipledispatch 0.6.0 py_0 conda-forge munch 2.3.2 py_0 conda-forge nbconvert 5.6.0 py_0 conda-forge nbformat 4.4.0 py_1 conda-forge nc-time-axis 1.2.0 py_0 conda-forge ncar-jobqueue 2019.9.11 pypi_0 pypi ncurses 6.1 hf484d3e_1002 conda-forge netcdf-fortran 4.4.5 hea25ff8_1000 conda-forge netcdf4 1.5.1.2 py36had58050_0 conda-forge networkx 2.3 py_0 conda-forge nodejs 12.4.0 he1b5a44_0 conda-forge nomkl 3.0 0
notebook 6.0.1 py36_0 conda-forge numba 0.45.1 py36hb3f55d8_0 conda-forge numcodecs 0.6.3 py36hf484d3e_0 conda-forge numpy 1.17.2 py36h95a1406_0 conda-forge oauthlib 3.1.0 pypi_0 pypi olefile 0.46 py_0 conda-forge openjpeg 2.3.1 h58a6597_0 conda-forge openssl 1.1.1c h516909a_0 conda-forge owslib 0.18.0 py_0 conda-forge packaging 19.0 py_0 conda-forge pandas 0.25.1 py36hb3f55d8_0 conda-forge pandoc 2.7.3 0 conda-forge pandocfilters 1.4.2 py_1 conda-forge panel 0.6.2 0 conda-forge pango 1.42.4 ha030887_1 conda-forge papermill 1.2.0 py36_0 conda-forge param 1.9.1 py_0 conda-forge parso 0.5.1 py_0 conda-forge partd 1.0.0 py_0 conda-forge patsy 0.5.1 py_0 conda-forge pcre 8.41 hf484d3e_1003 conda-forge perl 5.26.2 h516909a_1006 conda-forge pexpect 4.7.0 py36_0 conda-forge pickleshare 0.7.5 py36_1000 conda-forge pillow 6.1.0 py36h6b7be26_1 conda-forge pint 0.8.1 py_1 conda-forge pip 19.2.3 py36_0 conda-forge pixman 0.38.0 h516909a_1003 conda-forge pluggy 0.12.0 py_0 conda-forge pooch 0.5.2 py36_0 conda-forge pop-tools 0.0.post61 pypi_0 pypi poppler 0.67.0 ha967d66_7 conda-forge poppler-data 0.4.9 1 conda-forge postgresql 11.5 hc63931a_1 conda-forge progress-bar 8 pypi_0 pypi proj4 5.2.0 he1b5a44_1005 conda-forge prometheus_client 0.7.1 py_0 conda-forge prompt_toolkit 2.0.9 py_0 conda-forge properscoring 0.1 py_0 conda-forge proplot 1.0 pypi_0 pypi psutil 5.6.3 py36h516909a_0 conda-forge pthread-stubs 0.4 h14c3975_1001 conda-forge ptyprocess 0.6.0 py_1001 conda-forge py 1.8.0 py_0 conda-forge pyasn1 0.4.7 pypi_0 pypi pyasn1-modules 0.2.7 pypi_0 pypi pycodestyle 2.5.0 py_0 conda-forge pycparser 2.19 py36_1 conda-forge pyct 0.4.6 py_0 conda-forge pyct-core 0.4.6 py_0 conda-forge pyepsg 0.4.0 py_0 conda-forge pyflakes 2.1.1 py_0 conda-forge pygments 2.4.2 py_0 conda-forge pykdtree 1.3.1 py36h3010b51_1002 conda-forge pynio 1.5.5 py36h8b983ae_0 conda-forge pyopenssl 19.0.0 py36_0 conda-forge pyparsing 2.4.2 py_0 conda-forge pyproj 1.9.6 py36h516909a_1002 conda-forge pyqt 5.9.2 py36hcca6a23_4 conda-forge pyrsistent 0.15.4 py36h516909a_0 conda-forge pyshp 2.1.0 py_0 conda-forge pysocks 1.7.1 py36_0 conda-forge pytest 5.0.1 py36_0 conda-forge python 3.6.7 h357f687_1005 conda-forge python-blosc 1.8.1 py36hf484d3e_0 conda-forge python-dateutil 2.8.0 py_0 conda-forge python-graphviz 0.11.1 py_1 conda-forge python-snappy 0.5.4 py36hee44bf9_0 conda-forge pytz 2019.2 py_0 conda-forge pyviz_comms 0.7.2 py_0 conda-forge pywavelets 1.0.3 py36hd352d35_1 conda-forge pyyaml 5.1.2 py36h516909a_0 conda-forge pyzmq 18.1.0 py36h1768529_0 conda-forge qt 5.9.7 h52cfd70_2 conda-forge qtconsole 4.5.3 py_0 conda-forge rasterio 1.0.25 py36hdff7cfa_0 conda-forge readline 8.0 hf8c457e_0 conda-forge requests 2.22.0 py36_1 conda-forge requests-oauthlib 1.2.0 pypi_0 pypi rsa 4.0 pypi_0 pypi rtree 0.8.3 py36h666c49c_1002 conda-forge ruamel 1.0 py36_0 conda-forge ruamel.yaml 0.16.5 py36h516909a_1 conda-forge ruamel.yaml.clib 0.1.2 py36h516909a_0 conda-forge s3fs 0.3.4 pypi_0 pypi s3transfer 0.2.1 py36_0 conda-forge salem 0.2.4 pypi_0 pypi scikit-image 0.15.0 py36hb3f55d8_2 conda-forge scikit-learn 0.21.3 py36hcdab131_0 conda-forge scipy 1.3.1 py36h921218d_2 conda-forge seaborn 0.9.0 py_1 conda-forge send2trash 1.5.0 py_0 conda-forge setuptools 41.2.0 py36_0 conda-forge shapely 1.6.4 py36hec07ddf_1006 conda-forge simpervisor 0.3 py_1 conda-forge sip 4.19.8 py36hf484d3e_1000 conda-forge six 1.12.0 py36_1000 conda-forge sklearn-xarray 0.3.0 pypi_0 pypi snakeviz 2.0.0 py_0 conda-forge snappy 1.1.7 he1b5a44_1002 conda-forge snuggs 1.4.6 
py_0 conda-forge sortedcontainers 2.1.0 py_0 conda-forge sqlite 3.29.0 hcee41ef_0 conda-forge statsmodels 0.10.1 py36hc1659b7_0 conda-forge tabulate 0.8.3 py_0 conda-forge tblib 1.4.0 py_0 conda-forge tenacity 5.1.1 py36_0 conda-forge terminado 0.8.2 py36_0 conda-forge testpath 0.4.2 py_1001 conda-forge textwrap3 0.9.2 py_0 conda-forge thrift 0.11.0 py36hf484d3e_1001 conda-forge tk 8.6.9 hed695b0_1002 conda-forge toml 0.10.0 py_0 conda-forge toolz 0.10.0 py_0 conda-forge tornado 5.1.1 py36h14c3975_1000 conda-forge tqdm 4.34.0 py_0 conda-forge traitlets 4.3.2 py36_1000 conda-forge typing_extensions 3.7.4 py36_0 conda-forge tzcode 2019a h516909a_1002 conda-forge udunits2 2.2.27.6 h4e0c4b3_1001 conda-forge urllib3 1.25.6 py36_0 conda-forge viscm 0.7 pyh24bf2e0_0 conda-forge wcwidth 0.1.7 py_1 conda-forge webencodings 0.5.1 py_1 conda-forge wheel 0.33.6 py36_0 conda-forge widgetsnbextension 3.5.1 py36_0 conda-forge xarray 0.14.0 pypi_0 pypi xerces-c 3.2.2 hea5cb30_1003 conda-forge xesmf 0.2.0 py36_0 conda-forge xgcm 0.2.0 py_0 conda-forge xhistogram 0.1.1 py_0 conda-forge xorg-kbproto 1.0.7 h14c3975_1002 conda-forge xorg-libice 1.0.10 h516909a_0 conda-forge xorg-libsm 1.2.3 h84519dc_1000 conda-forge xorg-libx11 1.6.8 h516909a_0 conda-forge xorg-libxau 1.0.9 h14c3975_0 conda-forge xorg-libxdmcp 1.1.3 h516909a_0 conda-forge xorg-libxext 1.3.4 h516909a_0 conda-forge xorg-libxpm 3.5.12 h14c3975_1002 conda-forge xorg-libxrender 0.9.10 h516909a_1002 conda-forge xorg-libxt 1.1.5 h516909a_1003 conda-forge xorg-renderproto 0.11.1 h14c3975_1002 conda-forge xorg-xextproto 7.3.0 h14c3975_1002 conda-forge xorg-xproto 7.0.31 h14c3975_1007 conda-forge xrft 0.2.0 py_0 conda-forge xskillscore 0.0.7 py_0 conda-forge xz 5.2.4 h14c3975_1001 conda-forge yaml 0.1.7 h14c3975_1001 conda-forge yarl 1.3.0 py36h14c3975_1000 conda-forge zarr 2.3.2 py36_0 conda-forge zeromq 4.3.2 he1b5a44_2 conda-forge zict 1.0.0 py_0 conda-forge zipp 0.5.2 py_0 conda-forge zlib 1.2.11 h516909a_1005 conda-forge zstd 1.4.0 h3b9ef0a_0 conda-forge