det-lab / jupyterhub-deploy-kubernetes-jetstream

CDMS JupyterHub deployment on XSEDE Jetstream

File Save Error for jupyter notebook #11

Closed. ranchen2025 closed this issue 4 years ago

ranchen2025 commented 4 years ago

The error info is in the attached screenshot (save_error). It shows up every time the Jupyter notebook tries to autosave or when I click Save.

pibion commented 4 years ago

Thanks for the issue @ranchen2025! You may want to mention @zonca on posts like this; this issue didn't automatically show up in my email.

zonca commented 4 years ago

Please upload the notebook to http://gist.github.com

and if it needs some input data, please also explain how to get it.

I think the problem is that your notebook is too big.

pibion commented 4 years ago

@ranchen2025 could you post links to your data repository and notebook repository?

ranchen2025 commented 4 years ago

Hello, sorry for the late reply. The link to the repository that includes my notebook and data is: https://github.com/ranchen2025/NR5.git Will this work?

Best regards, Ran Chen

pibion commented 4 years ago

@ranchen2025 I can access those, thanks!

zonca commented 4 years ago

This is the first time I've seen a notebook with more than 100 cells. I think JupyterLab cannot save it because, once you produce the plots, the file is several MB. Is there a reason it is so long, instead of splitting it into several smaller notebooks?

Can you test with smaller notebooks and see if you find the same issue?
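
One way to test this without splitting anything is to strip the saved outputs (the plots are usually most of the file size) and see whether the stripped copy saves fine. A minimal sketch using nbformat; the filenames are just placeholders:

```python
# Sketch: drop saved outputs from a notebook to check whether the plot
# outputs are what pushes the file over the save limit.
# "NR5.ipynb" is a placeholder filename, not the actual notebook name.
import os
import nbformat

path = "NR5.ipynb"
print("before:", os.path.getsize(path) // 1024, "KB")

nb = nbformat.read(path, as_version=4)
for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []            # remove plots and other outputs
        cell.execution_count = None  # reset execution counters

nbformat.write(nb, "NR5-stripped.ipynb")
print("after:", os.path.getsize("NR5-stripped.ipynb") // 1024, "KB")
```

The same thing can be done from the command line with `jupyter nbconvert --clear-output --inplace NR5.ipynb`.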

ranchen2025 commented 4 years ago

Hello,

Thanks! Your advice helped a lot. The issue doesn't show up after I switched to smaller notebooks. I will avoid making such large notebooks in the future.

Best regards, Ran Chen

zonca commented 4 years ago

Actually, there is probably a fix, see https://github.com/jupyterlab/jupyterlab/issues/4214. I'll try to implement it in the setup.

zonca commented 4 years ago

@ranchen2025 @pibion I've increased the maximum request size to 20 MB, so it should not cause problems even with very large notebooks.
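
For reference, the "413 Request Entity Too Large" error in that JupyterLab issue comes from the nginx ingress rejecting request bodies above its default limit; on a Kubernetes deployment it is typically raised with the `nginx.ingress.kubernetes.io/proxy-body-size` annotation. A hedged sketch of what that looks like in Helm-style ingress values (the section and values are illustrative, not necessarily this repo's actual config):

```yaml
# Illustrative only: raise the nginx ingress request-body limit so that
# saves/uploads of large notebooks are not rejected with
# "413 Request Entity Too Large".
ingress:
  enabled: true
  annotations:
    nginx.ingress.kubernetes.io/proxy-body-size: 20m
```

The same annotation can also be applied directly to an existing Ingress object with kubectl if editing the chart values is not convenient.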

blackholesun137 commented 4 years ago

I'm still having issues with text files above a certain size. I've tried uploading a 15 MB text file, as well as opening a blank text file in Jupyter, copy-pasting the data there, and trying to save, but, in either case, I get the error, "Invalid response: 413 Request Entity Too Large." I have been able to upload a 1 KB text file, so the issue can't be with text files in general.

zonca commented 4 years ago

Confirmed. I guess some margin is needed on top of the file size. I've increased the limit from 20 MB to 100 MB and successfully uploaded a 16 MB PDF. Can you try again?

blackholesun137 commented 4 years ago

I was able to upload it this time. Thank you.