frictionlessdata / datapackage-pipelines

Framework for processing data packages in pipelines of modular components.
https://frictionlessdata.io/
MIT License

Unable to run a YAML pipeline where the resources are zipped datapackage files #117

Open norahw opened 6 years ago

norahw commented 6 years ago

Hi there
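
For context, the pipeline spec I am running looks roughly like this (the pipeline name, resource name, and URL below are placeholders, not my actual values):

```yaml
zipped-datapackage-example:
  pipeline:
    - run: add_resource
      parameters:
        name: example-resource
        url: https://example.com/datapackage.zip
    - run: stream_remote_resources
    - run: dump.to_path
      parameters:
        out-path: output
```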

When attempting to stream zipped datapackage files as resources, the `stream_remote_resources` processor fails with a `FileNotFoundError`: the temporary file it tries to create points into a directory that does not exist.

```
ERROR log from processor stream_remote_resources:
+--------
| Traceback (most recent call last):
|   File "/anaconda/lib/python3.6/site-packages/datapackage_pipelines/specs/../lib/stream_remote_resources.py", line 212, in
|     rows = stream_reader(resource, url, ignore_missing or url == "")
|   File "/anaconda/lib/python3.6/site-packages/datapackage_pipelines/specs/../lib/stream_remote_resources.py", line 169, in stream_reader
|     schema, headers, stream, close = get_opener(url, _resource)()
|   File "/anaconda/lib/python3.6/site-packages/datapackage_pipelines/specs/../lib/stream_remote_resources.py", line 154, in opener
|     _stream.open()
|   File "/anaconda/lib/python3.6/site-packages/tabulator/stream.py", line 169, in open
|     source = tempfile.NamedTemporaryFile(suffix='.' + name)
|   File "/anaconda/lib/python3.6/tempfile.py", line 549, in NamedTemporaryFile
|     (fd, name) = _mkstemp_inner(dir, prefix, suffix, flags, output_type)
|   File "/anaconda/lib/python3.6/tempfile.py", line 260, in _mkstemp_inner
|     fd = _os.open(file, flags, 0o600)
| FileNotFoundError: [Errno 2] No such file or directory: '/var/folders/hv/8kj9pgt513q6kx2h29yyrrfr0000gn/T/tmprg7n15qi.data/55e4d040-1491-4ef1-9d3d-66b05fae4280.csv'
+--------
```
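Looking at the failing path in the traceback, the suffix passed to `tempfile.NamedTemporaryFile` seems to contain a path separator (the inner resource path from the zip), so the generated filename points into a subdirectory of the temp dir that was never created. A minimal sketch of that failure mode, with a made-up suffix:

```python
import tempfile

# A suffix containing "/" makes the temp file path land inside a
# nonexistent subdirectory of the temp dir, so os.open fails with
# FileNotFoundError (errno 2), matching the traceback above.
# The suffix value here is a hypothetical stand-in.
try:
    tempfile.NamedTemporaryFile(suffix='.data/example.csv')
except FileNotFoundError as exc:
    print('reproduced:', exc)
```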

Thanks a lot

akariv commented 6 years ago

Thanks for reporting @norahw - will try to reproduce.