jianxu305 / nCov2019_analysis

Analysis of 2019-nCov coronavirus data
GNU General Public License v3.0

ModuleNotFoundError: No module named 'utils' #9

Closed YueXiaoqian closed 4 years ago

YueXiaoqian commented 4 years ago

Hello, I have already run pip install utils, and it reports success: Requirement already satisfied: utils in d:\python\lib\site-packages (1.0.1)

But when I try to import utils in a Python notebook, it still raises an error. How can I fix this?

ModuleNotFoundError Traceback (most recent call last)

in
----> 1 import utils

ModuleNotFoundError: No module named 'utils'
jianxu305 commented 4 years ago

Looks like you didn't set up your path correctly. Please add the directory that contains utils.py to your system path.

Or if you only want to change the path within this Python session, try:

import sys
sys.path.append(...)  # add the path to utils here
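
For example, if you unpacked the repo so that utils.py sits under a src folder, something like the sketch below should work. The path here is only an illustration; adjust it to wherever the file actually lives on your machine:

import sys
sys.path.append(r'E:\Jupyter File\nCov2019_analysis-master\src')  # folder that contains utils.py
import utils

After that, import utils should resolve to the repo's own module rather than the unrelated utils package from PyPI.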

YueXiaoqian commented 4 years ago

Thanks for your response! I can now successfully import the package. But when I run:

data = utils.load_chinese_data()

There is a URLError:

URLError: <urlopen error [Errno 11004] getaddrinfo failed>

jianxu305 commented 4 years ago

A URLError? Can you print out the stack trace and see which URL it is requesting?
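
If it helps, here is a rough way to print both the data URL and the full trace. This assumes the URL is kept in the module-level _DXY_DATA_FILE_ constant, as in this repo's src/utils.py:

import traceback
import utils

print(utils._DXY_DATA_FILE_)  # the CSV URL that load_chinese_data() downloads

try:
    data = utils.load_chinese_data()
except Exception:
    traceback.print_exc()  # full stack trace, including the failing request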

YueXiaoqian commented 4 years ago

gaierror                                  Traceback (most recent call last)
~\anaconda3\lib\urllib\request.py in do_open(self, http_class, req, **http_conn_args)
   1318                 h.request(req.get_method(), req.selector, req.data, headers,
-> 1319                           encode_chunked=req.has_header('Transfer-encoding'))
   1320             except OSError as err: # timeout error

~\anaconda3\lib\http\client.py in request(self, method, url, body, headers, encode_chunked)
   1251         """Send a complete request to the server."""
-> 1252         self._send_request(method, url, body, headers, encode_chunked)
   1253

~\anaconda3\lib\http\client.py in _send_request(self, method, url, body, headers, encode_chunked)
   1297             body = _encode(body, 'body')
-> 1298         self.endheaders(body, encode_chunked=encode_chunked)
   1299

~\anaconda3\lib\http\client.py in endheaders(self, message_body, encode_chunked)
   1246             raise CannotSendHeader()
-> 1247         self._send_output(message_body, encode_chunked=encode_chunked)
   1248

~\anaconda3\lib\http\client.py in _send_output(self, message_body, encode_chunked)
   1025         del self._buffer[:]
-> 1026         self.send(msg)
   1027

~\anaconda3\lib\http\client.py in send(self, data)
    965         if self.auto_open:
--> 966             self.connect()
    967         else:

~\anaconda3\lib\http\client.py in connect(self)
   1413
-> 1414         super().connect()
   1415

~\anaconda3\lib\http\client.py in connect(self)
    937         self.sock = self._create_connection(
--> 938             (self.host,self.port), self.timeout, self.source_address)
    939         self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

~\anaconda3\lib\socket.py in create_connection(address, timeout, source_address)
    706     err = None
--> 707     for res in getaddrinfo(host, port, 0, SOCK_STREAM):
    708         af, socktype, proto, canonname, sa = res

~\anaconda3\lib\socket.py in getaddrinfo(host, port, family, type, proto, flags)
    751     addrlist = []
--> 752     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    753         af, socktype, proto, canonname, sa = res

gaierror: [Errno 11004] getaddrinfo failed

During handling of the above exception, another exception occurred:

URLError Traceback (most recent call last)

in
----> 1 data = utils.load_chinese_data()

E:\Jupyter File\nCov2019_analysis-master\src\utils.py in load_chinese_data()
    100 def load_chinese_data():
    101     ''' This includes some basic cleaning'''
--> 102     data = load_chinese_raw()
    103     return rename_cities(data)
    104

E:\Jupyter File\nCov2019_analysis-master\src\utils.py in load_chinese_raw()
    108     This provides a way to lookinto the 'raw' data
    109     '''
--> 110     raw = pd.read_csv(_DXY_DATA_FILE_)
    111
    112     # the original CSV column names are in camel case, change to lower_case convention

~\anaconda3\lib\site-packages\pandas\io\parsers.py in parser_f(filepath_or_buffer, sep, delimiter, header, names, index_col, usecols, squeeze, prefix, mangle_dupe_cols, dtype, engine, converters, true_values, false_values, skipinitialspace, skiprows, skipfooter, nrows, na_values, keep_default_na, na_filter, verbose, skip_blank_lines, parse_dates, infer_datetime_format, keep_date_col, date_parser, dayfirst, cache_dates, iterator, chunksize, compression, thousands, decimal, lineterminator, quotechar, quoting, doublequote, escapechar, comment, encoding, dialect, error_bad_lines, warn_bad_lines, delim_whitespace, low_memory, memory_map, float_precision)
    674         )
    675
--> 676         return _read(filepath_or_buffer, kwds)
    677
    678     parser_f.__name__ = name

~\anaconda3\lib\site-packages\pandas\io\parsers.py in _read(filepath_or_buffer, kwds)
    429     # See https://github.com/python/mypy/issues/1297
    430     fp_or_buf, _, compression, should_close = get_filepath_or_buffer(
--> 431         filepath_or_buffer, encoding, compression
    432     )
    433     kwds["compression"] = compression

~\anaconda3\lib\site-packages\pandas\io\common.py in get_filepath_or_buffer(filepath_or_buffer, encoding, compression, mode)
    170
    171     if isinstance(filepath_or_buffer, str) and is_url(filepath_or_buffer):
--> 172         req = urlopen(filepath_or_buffer)
    173         content_encoding = req.headers.get("Content-Encoding", None)
    174         if content_encoding == "gzip":

~\anaconda3\lib\site-packages\pandas\io\common.py in urlopen(*args, **kwargs)
    139     import urllib.request
    140
--> 141     return urllib.request.urlopen(*args, **kwargs)
    142
    143

~\anaconda3\lib\urllib\request.py in urlopen(url, data, timeout, cafile, capath, cadefault, context)
    220     else:
    221         opener = _opener
--> 222     return opener.open(url, data, timeout)
    223
    224 def install_opener(opener):

~\anaconda3\lib\urllib\request.py in open(self, fullurl, data, timeout)
    523             req = meth(req)
    524
--> 525         response = self._open(req, data)
    526
    527         # post-process response

~\anaconda3\lib\urllib\request.py in _open(self, req, data)
    541         protocol = req.type
    542         result = self._call_chain(self.handle_open, protocol, protocol +
--> 543                                   '_open', req)
    544         if result:
    545             return result

~\anaconda3\lib\urllib\request.py in _call_chain(self, chain, kind, meth_name, *args)
    501         for handler in handlers:
    502             func = getattr(handler, meth_name)
--> 503             result = func(*args)
    504             if result is not None:
    505                 return result

~\anaconda3\lib\urllib\request.py in https_open(self, req)
   1360     def https_open(self, req):
   1361         return self.do_open(http.client.HTTPSConnection, req,
-> 1362                             context=self._context, check_hostname=self._check_hostname)
   1363
   1364     https_request = AbstractHTTPHandler.do_request_

~\anaconda3\lib\urllib\request.py in do_open(self, http_class, req, **http_conn_args)
   1319                           encode_chunked=req.has_header('Transfer-encoding'))
   1320             except OSError as err: # timeout error
-> 1321                 raise URLError(err)
   1322             r = h.getresponse()
   1323         except:

URLError: <urlopen error [Errno 11004] getaddrinfo failed>
jianxu305 commented 4 years ago

Can you copy the following link and paste it into your web browser to see if you can open it? https://raw.githubusercontent.com/BlankerL/DXY-2019-nCoV-Data/master/csv/DXYArea.csv

If not, then probably you have a firewall or something else blocking the link. Can you access GitHub in general?
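
Since [Errno 11004] getaddrinfo failed is a DNS lookup failure, you could also test name resolution and the download directly from Python. This is just a quick sketch, assuming nothing on your network intercepts HTTPS:

import socket
import urllib.request

# This is the step that raises errno 11004 if DNS is blocked or misconfigured.
print(socket.getaddrinfo('raw.githubusercontent.com', 443))

# If resolution works, try fetching the CSV itself.
url = 'https://raw.githubusercontent.com/BlankerL/DXY-2019-nCoV-Data/master/csv/DXYArea.csv'
print(urllib.request.urlopen(url).status)  # 200 means the file is reachable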

YueXiaoqian commented 4 years ago

I can't open it in my browser, although I have already turned off my firewall. It's weird. I need to check my computer settings. Thanks a lot!