glouppe closed this issue 8 years ago
I am closing this. Sorry for the noise. It seems it is rather related to binder. I have the exact same error with a different independent package.
Not pure noise, I just found out about mybinder.org :-)
Neil
mybinder is cool. Found this a couple of weeks ago http://ivory.idyll.org/blog/2016-mybinder.html
Hi, I have the same issue on the console and in Jupyter Notebook with Python 3.5. I have tried everything I found on the web, but for some reason, in
datasets.py
json_data=open(path).read()
always calls
.../lib/python3.5/encodings/ascii.py
so all I could do was modify the source code to pass an encoding option to the function in datasets.py, forcing it to use 'utf-8',
as below:
if not (on_rtd):
    path = os.path.join(os.path.dirname(__file__), 'data_resources.json')
    json_data = open(path, encoding='utf-8').read()
    data_resources = json.loads(json_data)

if not (on_rtd):
    path = os.path.join(os.path.dirname(__file__), 'football_teams.json')
    json_data = open(path, encoding='utf-8').read()
    football_dict = json.loads(json_data)
It would be great if the developers added the encoding='utf-8' option in the next update; otherwise I will probably have to apply this patch on every release :(
Thanks!
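The behaviour described above can be reproduced in isolation: when the locale's preferred encoding is ASCII (e.g. LANG=C), `open()` without an `encoding` argument picks up that codec and fails on non-ASCII bytes in a UTF-8 JSON file. A minimal sketch of the failure and the fix; the file name and payload here are made up for illustration:

```python
import json
import os
import tempfile

# Hypothetical data file with non-ASCII content, written as UTF-8.
payload = {"team": "Bayern M\u00fcnchen"}

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "football_teams.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(payload, f, ensure_ascii=False)

# An ASCII codec (what Python falls back to when the locale's
# preferred encoding is ASCII) cannot decode the non-ASCII bytes:
try:
    open(path, encoding="ascii").read()
    ascii_ok = True
except UnicodeDecodeError:
    ascii_ok = False

# Passing encoding='utf-8' explicitly works regardless of locale:
with open(path, encoding="utf-8") as f:
    data = json.loads(f.read())

print(ascii_ok, data["team"])
```

Passing the encoding explicitly, as in the patch above, removes the dependence on the machine's locale settings entirely.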
Hi Ken,
We can easily make that change, but it's much better if you submit it by a pull request so you get the correct credit for fixing the issue and we keep the paper trail for what the problem was.
https://help.github.com/articles/using-pull-requests/
To understand correctly, is it an issue with reading those files on Python3.5 in general?
Neil
Hi Neil,
Thank you for this stunning project.
I made a pull request: https://github.com/SheffieldML/GPy/pull/378
To understand correctly, is it an issue with reading those files on Python3.5 in general?
Basically, I'm a JavaScript/Node developer with little Python experience, so I'm sorry I could not figure out why this error occurs in my environment.
So far, I have installed many other packages via pip, including TensorFlow and Theano, and all of them could download and read their data files without problems. For some unknown reason, GPy is unfortunately the only one with this issue so far. I also suspected my own Python 3.5 global encoding configuration, searched the web and triple-checked, but it seems fine.
Anyway, encoding issues seem notorious in the Python ecosystem, and I believe this fix should be useful in general.
Thanks! Ken
Thanks so much Ken! It's not really me, it's a community effort :-)
Max (@mzwiessele) is really the one driving things forward with the pull requests and taking the lead on the releases.
Encoding does indeed seem to be a nightmare with python, and the problem is compounded by the fact that we are currently releasing code for python 2 & 3.
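For code that has to run on both Python 2 and 3, one commonly used option (a sketch, not necessarily what GPy settled on) is `io.open`, which accepts an `encoding` argument on both major versions, unlike the Python 2 builtin `open()`. The path and payload below are illustrative:

```python
import io
import json
import os
import tempfile

# Write a small UTF-8 JSON file; the \u00e3 escape decodes to a
# non-ASCII character when the JSON is parsed.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "data_resources.json")
with io.open(path, "w", encoding="utf-8") as f:
    f.write(u'{"name": "S\\u00e3o Paulo"}')

# io.open(..., encoding='utf-8') reads the file identically on
# Python 2 and Python 3, independent of the locale:
with io.open(path, encoding="utf-8") as f:
    resources = json.loads(f.read())

print(resources["name"])
```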
Hi Neil,
It's also my pleasure to have been able to contribute this project even a little :)
Actually, while studying machine learning from a probabilistic rather than a neural-network perspective, in a generalized framework, I read your papers on Gaussian process latent variable models and felt they were truly epoch-making research. In fact, I'm looking forward to your book on GPs as the next-generation successor to PRML by Christopher M. Bishop!
Hi,
I am trying to make public on mybinder.org some notebooks of mine where GPy is used. Installation works fine, but importing GPy fails with the following error:
GPy is installed from the devel branch within the following conda environment: https://github.com/diana-hep/carl-notebooks/blob/master/environment.yml#L14
The tricky issue is that I can't reproduce the problem locally. The same environment file works fine on my machine :(
CC'ing some Binder people to the rescue: @rgbkrk @freeman-lab