Closed: abdullah693 closed this issue 5 years ago
Looks very similar to #2108 ?
Yes, exactly. I've attached a test notebook that fails. Basically, any notebook larger than 1 MB fails to save in my experience. test1.zip
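For anyone who wants to reproduce this without the attached archive, here is a minimal sketch that generates a notebook just over 1 MB. The filename and padding content are illustrative assumptions, not taken from test1.zip:

```python
import json
import os
import tempfile

def make_big_notebook(path, min_bytes=1_100_000):
    """Write a minimal nbformat-4 notebook whose serialized size
    exceeds min_bytes, by padding one cell's stream output."""
    pad = "x" * min_bytes  # filler text to inflate the file size
    nb = {
        "nbformat": 4,
        "nbformat_minor": 5,
        "metadata": {},
        "cells": [{
            "cell_type": "code",
            "metadata": {},
            "execution_count": 1,
            "source": "print('hello')",
            "outputs": [{
                "output_type": "stream",
                "name": "stdout",
                "text": pad,
            }],
        }],
    }
    with open(path, "w") as f:
        json.dump(nb, f)
    return os.path.getsize(path)

path = os.path.join(tempfile.mkdtemp(), "big_test.ipynb")
size = make_big_notebook(path)
print(size > 1_000_000)  # file exceeds the ~1 MB threshold
```

Uploading or saving a file generated this way should trigger the same failure if the 1 MB limit is in play.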
I've tested a few notebooks and was able to create ones larger than 10 MB without any problem, both fresh and via upload, in various environments (the PUT request returns a 200 from Tornado).
However, when you open Cloud Datalab via Cloud Shell Web Preview, [1] a simple Nginx proxy is created under devshell.appspot.com to serve the preview. That proxy has an upload limit by design. Can you try opening Datalab locally or on a VM instead?
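For context, this is the usual mechanism behind a hard ~1 MB cutoff at an Nginx proxy: the `client_max_body_size` directive defaults to 1m, and larger request bodies are rejected with "413 Request Entity Too Large". The snippet below is a hypothetical illustration of that mechanism, not the actual devshell.appspot.com configuration (whose limit is intentional and not user-adjustable):

```nginx
# Hypothetical proxy config, for illustration only.
# client_max_body_size defaults to 1m; bodies above it get a 413,
# which matches the ~1 MB save failures seen through Web Preview.
server {
    listen 8080;
    location / {
        client_max_body_size 10m;   # raise the 1m default
        proxy_pass http://localhost:8081;
    }
}
```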
Thank you, that fixed it. I was using Cloud Shell to connect to Datalab; connecting from the command line fixed the issue.
Even after updating to the latest version of Datalab, I cannot save Jupyter notebooks exceeding 1.07 MB in size. Clicking "save and checkpoint" results in an "autosave failed!" error. I searched online and found that the error might be due to Jupyter 4.0, but the latest Datalab release runs Jupyter 5.6.0, so that is not the cause.