tengyuntian opened 2 years ago
@tengyuntian can you please share a screenshot of what you were seeing?
This happens again. The screenshot is attached.
Similar to my issue, but not identical behavior: https://github.com/microsoft/vscode/issues/155820
Happened again, as the picture shows. I have to restart the .ipynb notebook.
@tengyuntian does it happen all the time or it only happens when you run some cells which emit rich outputs (with widgets)?
It happened when I run cells with huge outputs. Also, it will happen sometimes when my ipynb is too long.
@tengyuntian thanks, that's really helpful. Would it make sense to provide a code snippet that gets your VS Code into this state and share it with us?
My script keeps growing, and the data to be processed gets larger and larger as my research progresses, so the "grey background" problem happens more and more often. I have to close the Jupyter notebook and open it again to eliminate it, but that means a longer wait to re-run all functions and re-import all the data. I think the key problem is that the amount of data is too large. The data is in .txt format: I have over 30 txt files totaling over 1 million lines, and they get plotted into hundreds of plots to show the experimental results.
I found that every time it is working hard and not responding, if I scroll the interface up or down it crashes: the background becomes grey and all the outputs are hidden behind it.
Is there anything I can do to avoid this? Every time I open a notebook with heavy data outputs, it turns grey and sometimes stops responding. When it turns grey, I cannot see or select any output, so I have to re-open and re-run everything, and then it happens again.
I am running into the same issue these days, again and again. Is there any workaround? I cannot get anything done in Visual Studio Code. Help!
I am also experiencing this issue. VSCode+Jupyter is completely unusable.
@kevindany, @tengyuntian I was able to resolve this by exporting my .ipynb to a .py file, then converting the .py file back into a .ipynb:

1) Open the problem .ipynb
2) CTRL+SHIFT+P
3) Select Jupyter: Convert to Python Script
4) Open the .py file
5) Right click whitespace in the file and select Export Current File as Jupyter Notebook

Bonus points: you should be able to use git mv to carry over the versioning from the problematic file to the reconstituted one. (Rename the broken file to literally anything, then rename the reconstituted file to the broken file's original name using git mv.)
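My guess (not confirmed) for why the round-trip through a .py file helps is that the conversion drops the notebook's stored outputs, which are what the renderer chokes on. If that's right, you can get the same effect directly with only the stdlib, since a .ipynb is just JSON; a minimal sketch (file names are hypothetical):

```python
import json

def strip_outputs(path_in, path_out):
    """Remove stored outputs and execution counts from a notebook file.

    Huge saved outputs are the suspected trigger for the grey screen,
    so clearing them leaves the code cells intact but lightweight.
    """
    with open(path_in, encoding="utf-8") as f:
        nb = json.load(f)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    with open(path_out, "w", encoding="utf-8") as f:
        json.dump(nb, f, indent=1)

# strip_outputs("broken.ipynb", "clean.ipynb")  # hypothetical file names
```

Unlike the convert/export dance, this keeps markdown cells and cell boundaries exactly as they were.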
> @kevindany, @tengyuntian I was able to resolve this by exporting my .ipynb to a .py file, then converting the .py file back into a .ipynb. [...]
It does not work :-(, it grays again
Same issue here. As soon as my notebook reaches a certain length, it turns grey/black whenever I run any cell with a longer output or a graph.
I am also having this issue when I make plots with Plotly over large data. I tried making one Jupyter notebook per large plot, but it still does not work. I didn't have this issue before (I have plotted large datasets before and it was okay). Is there a temporary solution, like clearing a cache, to fix the problem for now?
I'm having the same problem with Plotly plots of large point clouds. In addition to turning grey, I get "VS Code not responding" and the program crashes every time I try to open that notebook.
I am also experiencing the same issue. Has anyone found anything helpful?
Same. VS Code and Jupyter Notebooks do not appear to be working well together...
Same on the latest ubuntu, latest vscode.
Not sure if Plotly is to blame, since deleting the renderer extension didn't help. It might just be due to the sheer length of the notebook.
> @kevindany, @tengyuntian I was able to resolve this by exporting my .ipynb to a .py file, then converting the .py file back into a .ipynb.
Thanks, it worked.
Same here with a large-ish notebook. CTRL+SHIFT+P -> Developer: Reload Window
fixes the problem (easier than renaming), but that also restarts the notebook, which is not ideal.
Found the same problem
Cell backgrounds turn grey, the editor becomes sluggish, and a modal window pops up saying the window is not responding.
This behaviour happens with a notebook whose cell contains a function that reads a 200 MB file. I tried the previous workarounds (exporting the file, reloading the window, copying to a new notebook), but it keeps happening.
Version: 1.77.3 (Universal)
Commit: 704ed70d4fd1c6bd6342c436f1ede30d1cff4710
Date: 2023-04-12T09:19:37.325Z
Electron: 19.1.11
Chromium: 102.0.5005.196
Node.js: 16.14.2
V8: 10.2.154.26-electron.0
OS: Darwin x64 22.4.0
Sandboxed: Yes
Using python kernel 3.10.9
Same problem here on Ubuntu 20.04.
Hi, has this issue been resolved yet? I keep hitting it, also with large files, and it's getting more frequent.
Same problem here. Does anyone know where the problem is?
Background grays out for medium-sized notebooks (vscode: Version: 1.74.3) - it must be a memory issue. It does not happen for small notebooks, but this should be considered a SERIOUS BUG, as all output in the notebook becomes invisible.
I'm running into this same issue (vscode version 1.79.1), but for me it appears to be tied to the libraries I'm importing into the notebook, not the size of the dataframes I'm operating on. For example, if I import plotly, this issue arises even if I never use any plotly functions in the notebook - even basic pandas operations (e.g. df.head()) will result in the graying out behavior.
UPDATE: I don't know why this fixed the issue, but switching from conda/mamba to poetry as my package manager resolved the problems I was having.
I see the same issue (version 1.79.1 Darwin ARM). The first sign is that the timer next to the spinning wheel updates more slowly and Code Helper (Renderer) uses a lot of CPU (>100%). At some point the underlying process running in the cell (in my case R) stops using CPU, but the renderer doesn't detect that the process is done and the spinning arrows keep turning. This happens even after a complete machine restart with low memory pressure (R is using about 1 GB of memory). Eventually I get the grey background shown in the screenshots and VS Code becomes unusable (at least in that window; even quitting takes a while). I tried hiding the output of the cell before running it, but that doesn't seem to make a difference. The timer gets updated retrospectively (initially showing a longer time and then going back to 1m 29s as shown below).
After that, I cannot create any more cells on that notebook.
Update: Turning off warnings in R (there were a lot of them being written by R into the cell output) avoids the problem. So it seems to be an issue with the amount of IO written into the cell output. Having said that, I have produced a lot of these calls in the past (with loads of warnings) without issues.
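If the diagnosis above is right (the sheer volume of warning output overwhelming the renderer), the Python-side equivalent of turning warnings off in R is to collect them instead of letting each one print into the cell. A minimal stdlib sketch, where `noisy` is a hypothetical stand-in for code that warns once per data point:

```python
import warnings

def noisy(n):
    # Hypothetical stand-in for code that emits one warning per data point.
    for i in range(n):
        warnings.warn(f"point {i} out of range")
    return n

# Collect warnings in memory instead of printing each one into the cell output.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")   # record every warning, no deduplication
    result = noisy(10_000)

# A single summary line reaches the frontend instead of 10,000 lines.
print(f"{result} points processed, {len(caught)} warnings suppressed")
```

The same idea applies to any chatty library: one summary line in the cell output instead of thousands.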
Hi! This issue has been around for almost a year now. Is there any workaround to avoid the interface turning grey and the cell outputs disappearing?
I hadn't had this issue until today. Now for every simple output I try to produce in the VS Code Jupyter notebook, I get a grey interface. Is there a workaround?
Not sure what catalyst started it, but I get a grey screen every time I try to use Jupyter. I tried uninstalling/reinstalling everything and rolling back to extensions from one week ago. Nothing helps.
My system is:
Version: 1.80.0 (system setup)
Commit: 660393deaaa6d1996740ff4880f1bad43768c814
Date: 2023-07-04T15:06:02.407Z
Electron: 22.3.14
ElectronBuildId: 21893604
Chromium: 108.0.5359.215
Node.js: 16.17.1
V8: 10.8.168.25-electron.0
OS: Windows_NT x64 10.0.19045
Jupyter: v2023.5.1101742258
Pylance: v2023.6.40
Python: v2023.10.1
Update: For me, this issue seems to be fixed. I believe the first time it happened was when I crashed the notebook by executing this kind of code:

```python
import numpy as np

x = np.logspace(0, 1, 2**24, 32,) - 1
x = ((2**24) * x / x.max()).round().astype(int)
print([str(z) for z in x])
```
In the web-based interface of Jupyter Notebook I receive the following from it:

```
IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.
To change this limit, set the config variable
`--NotebookApp.iopub_data_rate_limit`.

Current values:
NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
NotebookApp.rate_limit_window=3.0 (secs)
```
Apparently, the issue was that I had opened a notebook that had already run this code and was thus corrupted. Once I switched to a fresh notebook and executed different code (without the bug that outputs a string of 134217727 characters), all went well.
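For anyone hitting the same wall: the web interface protects itself with the rate limit quoted above, while VS Code apparently tries to render everything. A defensive sketch (stdlib only; `safe_repr` and the 1 MB cap are my own, the cap just mirrors the default `iopub_data_rate_limit` shown above) that truncates a huge output before it ever reaches the frontend:

```python
def safe_repr(seq, limit_bytes=1_000_000):
    """Return repr(list(seq)), truncated if it would exceed limit_bytes.

    Oversized cell output is what triggers the grey screen, so cap
    what is actually printed instead of dumping the whole thing.
    """
    text = repr(list(seq))
    if len(text) <= limit_bytes:
        return text
    # Keep the head of the output and report how much was dropped.
    return text[:limit_bytes] + f" ... [{len(text) - limit_bytes} characters truncated]"

print(safe_repr(range(10)))                   # small output passes through unchanged
print(safe_repr(range(10_000), limit_bytes=80))  # large output is cut short
```

A one-line guard like this would have turned the 134217727-character print above into a harmless truncated preview.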
Same problem here. Also with a longish ipynb file. Surely there MUST be a way to solve this?
I also have the same problem. When I create Plotly figures that contain a lot of data, VS Code crashes with the behaviour described in the previous comments. Hopefully there will be a solution soon.
I also have a similar problem.
me too
I have the same issue. When I run my code in Jupyter Notebook I'm getting the following error -> how can I solve the problem for VSCode?
Same problem here. Only installed python and jupyter plugins.
Same problem here. Triggering code is below. Below about 20,000 data points it's fine. Above that it'll eventually "grey screen".
```fsharp
open Plotly.NET
open Plotly.NET.TraceObjects
open Plotly.NET.LayoutObjects

let mapCentre =
    exceedances |> Seq.averageBy (fun m -> m.Position.Longitude),
    exceedances |> Seq.averageBy (fun m -> m.Position.Latitude)

clusters
|> Seq.map (fun cluster ->
    let coordsAsTuples =
        cluster.Exceedances
        // Mapbox assumes a Lon, Lat convention:
        |> Seq.map (fun ex -> ex.Position.Longitude, ex.Position.Latitude)
    Chart.PointMapbox(coordsAsTuples, Name = cluster.Id))
|> Chart.combine
|> Chart.withSize (1200, 800)
|> Chart.withMarkerStyle (Size = 10)
|> Chart.withMarginSize (Left = 0, Right = 0, Top = 30, Bottom = 0)
|> Chart.withTitle "Clusters"
|> Chart.withMapbox (Mapbox.init (Style = StyleParam.MapboxStyle.OpenStreetMap, Center = mapCentre, Zoom = 12.))
```
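Given the ~20,000-point tipping point reported above, one dodge is to thin the data before it ever reaches the chart. A stdlib-only Python sketch of the idea (the threshold is taken from that report, not a documented VS Code limit, and `downsample` is my own helper):

```python
import random

def downsample(points, max_points=20_000, seed=0):
    """Randomly thin a point list so the renderer never sees more than max_points."""
    if len(points) <= max_points:
        return points
    rng = random.Random(seed)  # fixed seed so re-runs plot the same subset
    return rng.sample(points, max_points)

# Hypothetical (lon, lat) grid standing in for a real point cloud.
coords = [(lon / 1000, lat / 1000) for lon in range(300) for lat in range(300)]
thinned = downsample(coords)
print(len(coords), "->", len(thinned))
```

Random thinning keeps the spatial distribution roughly intact, which is usually all a cluster overview map needs.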
Related on Stack Overflow:
Potentially related on r/vscode: https://www.reddit.com/r/vscode/comments/17kg5sg/help_with_ipynb_files/
In my case: I was using pd.read_csv(file, low_memory=False), and when I removed the low_memory argument, restarted, and cleared all outputs, the grey boxes disappeared.
Does everyone having this issue have the copilot extension?
Anecdotal, but this wasn't happening for me until I installed it. My only workaround has been to create a new notebook.
I do not have it and still happens to me.
Also having this issue. I have a .ipynb file that uses win32com to open an Excel file, refresh all queries in it (the Excel file reads from Access), save it, close it, and then delete the Excel instance in the notebook. I then read in a few sheets from the saved Excel file and do some cleaning, concatenation, etc. The notebook background goes grey and I can't see the output. Really frustrating.
Issue Type: Bug
I am using Jupyter Notebook in VS Code. When I scrolled down the interface, it turned grey. This has happened before. I cannot see any of the pictures clearly.
VS Code version: Code 1.69.1 (b06ae3b2d2dbfe28bca3134cc6be65935cdfea6a, 2022-07-12T08:21:24.514Z)
OS version: Windows_NT x64 10.0.19044
Restricted Mode: No
System Info
|Item|Value|
|---|---|
|CPUs|Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz (12 x 3192)|
|GPU Status|2d_canvas: enabled; canvas_oop_rasterization: disabled_off; direct_rendering_display_compositor: disabled_off_ok; gpu_compositing: enabled; multiple_raster_threads: enabled_on; opengl: enabled_on; rasterization: enabled; raw_draw: disabled_off_ok; skia_renderer: enabled_on; video_decode: enabled; video_encode: enabled; vulkan: disabled_off; webgl: enabled; webgl2: enabled|
|Load (avg)|undefined|
|Memory (System)|31.80GB (21.36GB free)|
|Process Argv|--crash-reporter-id 1c7786f7-3567-42d0-bd46-6060e36fc1d0|
|Screen Reader|no|
|VM|0%|
Extensions (7)
|Extension|Author (truncated)|Version|
|---|---|---|
|copilot|Git|1.31.6194|
|python|ms-|2022.10.0|
|vscode-pylance|ms-|2022.7.30|
|jupyter|ms-|2022.6.1101950301|
|jupyter-keymap|ms-|1.0.0|
|jupyter-renderers|ms-|1.0.8|
|cpptools|ms-|1.10.8|

(2 theme extensions excluded)