dask / dask-ec2

Start a cluster in EC2 for dask.distributed

cannot import name 's3' #41

Closed raymondchua closed 7 years ago

raymondchua commented 7 years ago

Hi, I managed to run the dask-ec2 up command and to ssh into the head node. I started IPython and executed the command "from distributed import Client, s3, progress", and got a "cannot import name 's3'" error. Here is the full traceback:

from distributed import Executor, s3, progress

ImportError                               Traceback (most recent call last)
in ()
----> 1 from distributed import Executor, s3, progress

ImportError: cannot import name 's3'
mrocklin commented 7 years ago

That syntax is old. Instead use a prefix like the following:

import dask.dataframe as dd
df = dd.read_csv('s3://bucket-name/myfiles.*.csv')

http://dask.pydata.org/en/latest/bytes.html
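Under the new bytes-handling layout, dask dispatches on the URL's protocol prefix (e.g. 's3://') rather than exposing an s3 module from distributed. A minimal sketch of that dispatch idea, using only the standard library (the function name split_protocol and the 'file' default are illustrative, not dask's actual API):

```python
def split_protocol(urlpath):
    """Split a URL like 's3://bucket/key' into (protocol, path).

    Paths without a protocol are treated as local files, mirroring
    how dask routes 's3://' paths to the s3fs backend.
    """
    if "://" in urlpath:
        protocol, path = urlpath.split("://", 1)
        return protocol, path
    return "file", urlpath

print(split_protocol("s3://bucket-name/myfiles.1.csv"))
print(split_protocol("local.csv"))
```

With this scheme, the same `dd.read_csv` call works for local and remote data; only the prefix changes.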

raymondchua commented 7 years ago

@mrocklin, so you are saying that the information in the dask-ec2 documentation is incorrect?

[screenshot from 2017-01-04 17-48-51]

mrocklin commented 7 years ago

Yes, that is old. Can I interest you in submitting a PR to update this documentation?


raymondchua commented 7 years ago

Thanks, I was going to say that I would do it, but you beat me to it. :)