microsoft / vscode-jupyter

VS Code Jupyter extension
https://marketplace.visualstudio.com/items?itemName=ms-toolsai.jupyter
MIT License
1.28k stars 287 forks

Cell stops execution without warning when using WSL: Ubuntu #16063

Open dominicdill opened 1 day ago

dominicdill commented 1 day ago

Applies To

What happened?

Using WSL2: Ubuntu 24.04.1 LTS, release 24.04, codename noble.

The issue is shown in this YouTube video link and in the attached video file (the attached video was edited heavily to reduce its size and allow upload).

https://github.com/user-attachments/assets/55a56325-b577-4e63-afcb-8ca70b16fcd9

I'm requesting data from a URL and storing it in a dataframe. The Jupyter Notebook cell doing this will unexpectedly cease execution without warning, leaving a dataframe that is missing crucial information. You can see from the video that I successfully grab information from three different station_ids, but because of the unexpected exit, my dataframe only holds information for the first two stations (the ones ending in 20 and 60; the one ending in 40 is missing). There are five unique stations I should be pulling data from in this loop, and I have done this successfully before, so I'm not sure what is going on.

I've been able to run through the cell at least once and grab all the required information, but I've since tried 10+ times without success and haven't gotten any closer to figuring out why it's failing.
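For reference, a minimal sketch of the kind of loop described above, with hypothetical names (`station_ids`, `fetch_station` are stand-ins; the real code requests a URL per station). Wrapping each iteration and logging progress makes a silent stop, a per-station failure, or a kernel interrupt (which surfaces as `KeyboardInterrupt`) visible instead of just producing a short dataframe:

```python
import pandas as pd

# Hypothetical station list; the real loop iterates over five station IDs.
station_ids = ["xxx20", "xxx40", "xxx60"]

def fetch_station(station_id):
    # Stub standing in for the real URL request and parsing.
    return pd.DataFrame({"station_id": [station_id], "value": [1.0]})

frames = []
for sid in station_ids:
    try:
        frames.append(fetch_station(sid))
        print(f"fetched {sid}: {len(frames[-1])} rows")
    except KeyboardInterrupt:
        # A kernel interrupt (SIGINT) raises KeyboardInterrupt mid-loop;
        # re-raising makes the stop explicit rather than silent.
        print(f"interrupted while fetching {sid}")
        raise
    except Exception as exc:
        print(f"failed for {sid}: {exc}")

df = pd.concat(frames, ignore_index=True)
print(f"total stations collected: {df['station_id'].nunique()}")
```

If the loop ends with fewer stations than expected but no exception printed, that points at the cell being cancelled from outside rather than the fetch code raising.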

Also, as you can see, the execution time reported for the cell is incorrect.

I'm also experiencing issues where my focus in Jupyter notebooks randomly changes: I'll be typing in a cell when suddenly my view shifts, my cursor has moved somewhere else, and I have to click back into the cell and line I was working on. This happens regularly.

VS Code Version

Version: 1.93.1 (user setup) Commit: 38c31bc77e0dd6ae88a4e9cc93428cc27a56ba40 Date: 2024-09-11T17:20:05.685Z Electron: 30.4.0 ElectronBuildId: 10073054 Chromium: 124.0.6367.243 Node.js: 20.15.1 V8: 12.4.254.20-electron.0 OS: Windows_NT x64 10.0.22631

Jupyter Extension Version

2024.8.1

Jupyter logs

12:25:33.844 [info] Restarted 7e895747-3ac4-4c31-98a4-33b293c24702
12:28:51.320 [info] Interrupt kernel execution
12:28:51.321 [info] Interrupt requested ~/Capstone/project-dominicdill/testing.ipynb
12:28:51.329 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
12:28:51.332 [warn] Cell completed with errors (cancelled)
12:28:51.334 [info] Interrupt kernel execution
12:28:51.334 [info] Interrupting kernel: python3125jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830
12:28:51.334 [info] Interrupting kernel via SIGINT
12:28:51.339 [warn] Cancel all remaining cells due to cancellation or failure in execution
12:28:51.341 [info] Interrupt requested & sent for ~/Capstone/project-dominicdill/testing.ipynb in notebookEditor.
12:28:52.087 [warn] Unhandled message found: execute_reply

More of the log follows, though I'm not sure what all of it refers to. The log above shows what happens after I restart my kernel and run into the issue.

Visual Studio Code (1.93.1, wsl, desktop)
Jupyter Extension Version: 2024.8.1.
Python Extension Version: 2024.14.1.
Pylance Extension Version: 2024.9.1.
Platform: linux (x64).
Temp Storage folder ~/.vscode-server/data/User/globalStorage/ms-toolsai.jupyter/version-2024.8.1
Workspace folder ~/Capstone/project-dominicdill, Home = /home/ddd
11:05:55.387 [info] Starting Kernel (Python Path: ~/miniconda3/envs/playground/bin/python, Conda, 3.12.5) for '~/Capstone/project-dominicdill/testing.ipynb' (disableUI=true)
11:06:00.117 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -c "import ipykernel; print(ipykernel.__version__); print("5dc3a68c-e34e-4080-9c3e-2a532b2ccb4d"); print(ipykernel.__file__)"
11:06:00.126 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m ipykernel_launcher --f=/home/~/.local/share/jupyter/runtime/kernel-v34ff1c91328d8340e499914cdbd82562998b72806.json
    > cwd: //home/~/Capstone/project-dominicdill
11:06:00.161 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m pip list
11:06:03.209 [info] Kernel successfully started
11:06:03.225 [info] Process Execution: ~/miniconda3/envs/playground/bin/python /home/~/.vscode-server/extensions/ms-toolsai.jupyter-2024.8.1-linux-x64/pythonFiles/printJupyterDataDir.py
11:08:47.802 [info] Interrupt kernel execution
11:08:47.803 [info] Interrupt requested ~/Capstone/project-dominicdill/testing.ipynb
11:08:47.804 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
11:08:47.805 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
11:08:47.805 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
11:08:47.806 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
11:08:47.823 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
11:08:47.834 [warn] Cell completed with errors (cancelled)
11:08:47.837 [info] Interrupt kernel execution
11:08:47.837 [info] Interrupting kernel: python3125jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830
11:08:47.838 [info] Interrupting kernel via SIGINT
11:08:47.855 [info] Interrupt requested & sent for ~/Capstone/project-dominicdill/testing.ipynb in notebookEditor.
11:08:48.640 [warn] Unhandled message found: execute_reply
11:10:12.030 [info] Restart requested ~/Capstone/project-dominicdill/testing.ipynb
11:10:12.045 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -c "import ipykernel; print(ipykernel.__version__); print("5dc3a68c-e34e-4080-9c3e-2a532b2ccb4d"); print(ipykernel.__file__)"
11:10:12.062 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m ipykernel_launcher --f=/home/~/.local/share/jupyter/runtime/kernel-v3dcd9c1cbe1ecbef66925ceedefc126a18af18531.json
    > cwd: //home/~/Capstone/project-dominicdill
11:10:12.513 [info] Restarted 7e895747-3ac4-4c31-98a4-33b293c24702
11:10:18.263 [info] Restart requested ~/Capstone/project-dominicdill/testing.ipynb
11:10:18.282 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -c "import ipykernel; print(ipykernel.__version__); print("5dc3a68c-e34e-4080-9c3e-2a532b2ccb4d"); print(ipykernel.__file__)"
11:10:18.305 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m ipykernel_launcher --f=/home/~/.local/share/jupyter/runtime/kernel-v383ac4fa90b66643f86c603a0ac82bb8d25a5954f.json
    > cwd: //home/~/Capstone/project-dominicdill
11:10:18.745 [info] Restarted 7e895747-3ac4-4c31-98a4-33b293c24702
11:15:15.945 [info] Interrupt kernel execution
11:15:15.946 [info] Interrupt requested ~/Capstone/project-dominicdill/testing.ipynb
11:15:15.953 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
11:15:15.960 [warn] Cell completed with errors (cancelled)
11:15:15.960 [info] Interrupt kernel execution
11:15:15.961 [info] Interrupting kernel: python3125jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830
11:15:15.961 [info] Interrupting kernel via SIGINT
11:15:15.964 [warn] Cancel all remaining cells due to cancellation or failure in execution
11:15:15.965 [info] Interrupt requested & sent for ~/Capstone/project-dominicdill/testing.ipynb in notebookEditor.
11:15:16.571 [warn] Unhandled message found: execute_reply
11:24:10.783 [info] Restart requested ~/Capstone/project-dominicdill/testing.ipynb
11:24:11.878 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -c "import ipykernel; print(ipykernel.__version__); print("5dc3a68c-e34e-4080-9c3e-2a532b2ccb4d"); print(ipykernel.__file__)"
11:24:11.884 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m ipykernel_launcher --f=/home/~/.local/share/jupyter/runtime/kernel-v355ad749d5ed8bcc7a400a3bb4b17df726dd74d83.json
    > cwd: //home/~/Capstone/project-dominicdill
11:24:12.696 [info] Restarted 7e895747-3ac4-4c31-98a4-33b293c24702
11:25:10.957 [info] Interrupt kernel execution
11:25:10.957 [info] Interrupt requested ~/Capstone/project-dominicdill/testing.ipynb
11:25:10.964 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
11:25:10.966 [warn] Cell completed with errors (cancelled)
11:25:10.967 [info] Interrupt kernel execution
11:25:10.967 [info] Interrupting kernel: python3125jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830
11:25:10.967 [info] Interrupting kernel via SIGINT
11:25:10.972 [warn] Cancel all remaining cells due to cancellation or failure in execution
11:25:10.974 [info] Interrupt requested & sent for ~/Capstone/project-dominicdill/testing.ipynb in notebookEditor.
11:25:11.715 [warn] Unhandled message found: execute_reply
12:08:14.849 [warn] Cell completed with errors nu [Error]: invalid syntax (172347264.py, line 1)
    at n.execute (/home/~/.vscode-server/extensions/ms-toolsai.jupyter-2024.8.1-linux-x64/dist/extension.node.js:297:4958) {
  ename: 'SyntaxError',
  evalue: 'invalid syntax (172347264.py, line 1)',
  traceback: [
    '\x1B[0;36m  Cell \x1B[0;32mIn[6], line 1\x1B[0;36m\x1B[0m\n' +
      '\x1B[0;31m    Jupter Show Outpit\x1B[0m\n' +
      '\x1B[0m           ^\x1B[0m\n' +
      '\x1B[0;31mSyntaxError\x1B[0m\x1B[0;31m:\x1B[0m invalid syntax\n'
  ]
}
12:08:18.373 [warn] Cell completed with errors nu [Error]: invalid syntax (247217307.py, line 1)
    at n.execute (/home/~/.vscode-server/extensions/ms-toolsai.jupyter-2024.8.1-linux-x64/dist/extension.node.js:297:4958) {
  ename: 'SyntaxError',
  evalue: 'invalid syntax (247217307.py, line 1)',
  traceback: [
    '\x1B[0;36m  Cell \x1B[0;32mIn[7], line 1\x1B[0;36m\x1B[0m\n' +
      '\x1B[0;31m    Jupter Show Output\x1B[0m\n' +
      '\x1B[0m           ^\x1B[0m\n' +
      '\x1B[0;31mSyntaxError\x1B[0m\x1B[0;31m:\x1B[0m invalid syntax\n'
  ]
}
12:13:09.199 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
12:13:09.199 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
12:13:09.199 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
12:13:09.200 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
12:13:09.200 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
12:13:11.007 [info] Restart requested ~/Capstone/project-dominicdill/testing.ipynb
12:13:12.188 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -c "import ipykernel; print(ipykernel.__version__); print("5dc3a68c-e34e-4080-9c3e-2a532b2ccb4d"); print(ipykernel.__file__)"
12:13:12.192 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m ipykernel_launcher --f=/home/~/.local/share/jupyter/runtime/kernel-v356fd1e540cefb3660f8fdec4769bfd02cfaddf0c.json
    > cwd: //home/~/Capstone/project-dominicdill
12:13:12.726 [info] Restarted 7e895747-3ac4-4c31-98a4-33b293c24702
12:25:32.275 [info] Restart requested ~/Capstone/project-dominicdill/testing.ipynb
12:25:33.021 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -c "import ipykernel; print(ipykernel.__version__); print("5dc3a68c-e34e-4080-9c3e-2a532b2ccb4d"); print(ipykernel.__file__)"
12:25:33.026 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m ipykernel_launcher --f=/home/~/.local/share/jupyter/runtime/kernel-v3c2d78710ba6396c329d6cb77e2ac2424bbe6afbb.json
    > cwd: //home/~/Capstone/project-dominicdill
12:25:33.844 [info] Restarted 7e895747-3ac4-4c31-98a4-33b293c24702
12:28:51.320 [info] Interrupt kernel execution
12:28:51.321 [info] Interrupt requested ~/Capstone/project-dominicdill/testing.ipynb
12:28:51.329 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
12:28:51.332 [warn] Cell completed with errors (cancelled)
12:28:51.334 [info] Interrupt kernel execution
12:28:51.334 [info] Interrupting kernel: python3125jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830
12:28:51.334 [info] Interrupting kernel via SIGINT
12:28:51.339 [warn] Cancel all remaining cells due to cancellation or failure in execution
12:28:51.341 [info] Interrupt requested & sent for ~/Capstone/project-dominicdill/testing.ipynb in notebookEditor.
12:28:52.087 [warn] Unhandled message found: execute_reply

Coding Language and Runtime Version

python 3.12.5

Language Extension Version (if applicable)

python 2024.14.1

Anaconda Version (if applicable)

24.7.1

Running Jupyter locally or remotely?

N/A or Not sure

dominicdill commented 1 day ago

Here is a detailed log with trace-level output. I think the issue is happening around this point:

finally:
    del __DW_GET_DF_VARS__
13:37:23.406 [debug] VS Code interrupted kernel for ~/Capstone/project-dominicdill/testing.ipynb
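The "Interrupting kernel via SIGINT" lines in the log reflect how the extension interrupts a raw kernel: it sends SIGINT to the kernel process, which Python's default handler turns into a `KeyboardInterrupt` in whatever cell is running. A minimal sketch of that mechanism, assuming a POSIX system (it delivers the signal to the current process rather than to a real kernel):

```python
import os
import signal
import time

# Simulate the interrupt path: send SIGINT to ourselves, as the extension
# does to the kernel process, while a "long-running cell" is executing.
interrupted = False
try:
    os.kill(os.getpid(), signal.SIGINT)  # stand-in for the extension's interrupt
    time.sleep(1)                        # stand-in for the long-running cell
except KeyboardInterrupt:
    interrupted = True

print("interrupted:", interrupted)
```

So if something (the extension, or another extension such as Data Wrangler queueing code) triggers an interrupt while the fetch loop is running, the loop stops exactly as if Ctrl+C had been pressed.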
Visual Studio Code (1.93.1, wsl, desktop)
Jupyter Extension Version: 2024.8.1.
Python Extension Version: 2024.14.1.
Pylance Extension Version: 2024.9.1.
Platform: linux (x64).
Temp Storage folder ~/.vscode-server/data/User/globalStorage/ms-toolsai.jupyter/version-2024.8.1
Workspace folder ~/Capstone/project-dominicdill, Home = /home/ddd
13:35:24.730 [debug] Start refreshing Kernel Picker (1726767324730)
13:35:26.185 [trace] Search for KernelSpecs in Interpreter ~/miniconda3/envs/wsl_playground/bin/python
13:35:26.189 [trace] Search for KernelSpecs in Interpreter /bin/python3
13:35:26.190 [trace] Search for KernelSpecs in Interpreter /usr/bin/python3
13:35:26.191 [trace] Search for KernelSpecs in Interpreter ~/miniconda3/bin/python
13:35:26.192 [trace] Search for KernelSpecs in Interpreter ~/miniconda3/envs/playground/bin/python
13:35:26.194 [debug] Get Custom Env Variables, Class name = Cm, completed in 1464ms, has a truthy return value, Arg 1: undefined, Arg 2: "RunPythonCode"
13:35:26.195 [debug] Jupyter Paths /kernels: 
13:35:26.195 [debug] Kernel Spec Root Paths, /usr/share/jupyter/kernels, /usr/local/share/jupyter/kernels, ~/.local/share/jupyter/kernels
13:35:26.860 [debug] KernelProvider switched kernel to id = .jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830./home/~/miniconda3/envs/playground/python./home/~/miniconda3/envs/playground/python.-m#ipykernel_launcher
13:35:26.861 [debug] start the kernel, options.disableUI=true for ~/Capstone/project-dominicdill/testing.ipynb
13:35:26.898 [trace] Registering commtarget jupyter.widget
13:35:26.902 [debug] Controller selection change completed
13:35:26.956 [debug] Get Custom Env Variables, Class name = Cm, completed in 1ms, has a truthy return value, Arg 1: "~/Capstone/project-dominicdill", Arg 2: "RunPythonCode"
13:35:26.957 [info] Starting Kernel (Python Path: ~/miniconda3/envs/playground/bin/python, Conda, 3.12.5) for '~/Capstone/project-dominicdill/testing.ipynb' (disableUI=true)
13:35:26.958 [trace] Creating raw notebook for resource '~/Capstone/project-dominicdill/testing.ipynb'
13:35:27.129 [debug] Get Custom Env Variables, Class name = Cm, completed in 4ms, has a truthy return value, Arg 1: "~/Capstone/project-dominicdill/testing.ipynb", Arg 2: "RunPythonCode"
13:35:27.526 [trace] Hiding default KernelSpec ~/miniconda3/envs/playground/bin/python for interpreter ~/miniconda3/envs/playground/bin/python (KernelSpec file ~/miniconda3/envs/playground/share/jupyter/kernels/python3/kernel.json)
13:35:27.528 [trace] Hiding default KernelSpec ~/miniconda3/bin/python for interpreter ~/miniconda3/bin/python (KernelSpec file ~/miniconda3/share/jupyter/kernels/python3/kernel.json)
13:35:27.529 [debug] End refreshing Kernel Picker (1726767324730)
13:35:28.078 [debug] Launching kernel .jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830./home/~/miniconda3/envs/playground/python./home/~/miniconda3/envs/playground/python.-m#ipykernel_launcher for ~/Capstone/project-dominicdill/testing.ipynb in //home/~/Capstone/project-dominicdill with ports 9001, 9000, 9004, 9002, 9003
13:35:28.928 [debug] Interpreter for Pylance for Notebook URI "~/Capstone/project-dominicdill/testing.ipynb" is ~/miniconda3/envs/playground/bin/python
13:35:29.188 [trace] Conda file is conda
13:35:31.923 [debug] Got env vars from Python Ext in 4968ms for ~/miniconda3/envs/playground/bin/python, with env var count 56.
13:35:31.924 [trace] Prepend PATH with python bin for ~/miniconda3/envs/playground/bin/python
13:35:31.925 [debug] Getting activated env variables, Class name = gd, completed in 4971ms, has a truthy return value, Arg 1: "~/Capstone/project-dominicdill", Arg 2: "~/miniconda3/envs/playground/bin/python", Arg 3: undefined
13:35:31.930 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m pip list
13:35:31.932 [debug] Got env vars from Python Ext in 4808ms for ~/miniconda3/envs/playground/bin/python, with env var count 56.
13:35:31.932 [trace] Prepend PATH with python bin for ~/miniconda3/envs/playground/bin/python
13:35:31.933 [debug] Getting activated env variables, Class name = gd, completed in 4809ms, has a truthy return value, Arg 1: "~/Capstone/project-dominicdill/testing.ipynb", Arg 2: "~/miniconda3/envs/playground/bin/python", Arg 3: undefined
13:35:31.937 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -c "import ipykernel; print(ipykernel.__version__); print("5dc3a68c-e34e-4080-9c3e-2a532b2ccb4d"); print(ipykernel.__file__)"
13:35:31.942 [info] Process Execution: ~/miniconda3/envs/playground/bin/python -m ipykernel_launcher --f=/home/~/.local/share/jupyter/runtime/kernel-v3e21030c95cc40efa1432f92a0fc5b89a0653~1.json
    > cwd: //home/~/Capstone/project-dominicdill
13:35:31.942 [debug] Kernel process 35634.
13:35:32.245 [trace] ipykernel version & path 6.29.5, ~/miniconda3/envs/playground/lib/python3.12/site-packages/ipykernel/__init__.py for ~/miniconda3/envs/playground/bin/python
13:35:32.719 [debug] Kernel output 35634: To connect another client to this kernel, use:
--existing kernel-v3e21030c95cc40efa1432f92a0fc5b89a0653~1.json
13:35:32.747 [debug] Waiting for Raw Session to be ready in postStartRawSession
13:35:32.747 [debug] Waiting for Raw session to be ready, status: connected
13:35:32.747 [trace] Raw session connected
13:35:32.748 [debug] Waited for Raw session to be ready & got status: connected
13:35:32.748 [debug] Successfully waited for Raw Session to be ready in postStartRawSession
13:35:32.749 [debug] Kernel status is 'unknown' before requesting kernel info and after ready
13:35:32.749 [debug] Sending request for kernelInfo
13:35:32.756 [trace] Got response for requestKernelInfo
13:35:32.757 [debug] Successfully completed postStartRawSession after 1 attempt(s) in 8ms
13:35:32.759 [trace] Executing silently Code (idle) = import sys as _VSCODE_sys\nprint(_VSCODE_sys.executable); del _VSCODE_sys
13:35:32.766 [trace] Executing silently Code (completed) = import sys as _VSCODE_sys\nprint(_VSCODE_sys.executable); del _VSCODE_sys with 1 output(s)
13:35:32.767 [trace] Started running kernel initialization for ~/Capstone/project-dominicdill/testing.ipynb
13:35:32.767 [trace] Executing silently Code (idle) = try:\nimport ipywidgets as _VSCODE_ipywidgets\nprint("e976ee50-99ed-4aba-9b6b-9dcd5634d07d:IPy
13:35:32.773 [trace] Executing silently Code (completed) = try:\nimport ipywidgets as _VSCODE_ipywidgets\nprint("e976ee50-99ed-4aba-9b6b-9dcd5634d07d:IPy with 0 output(s)
13:35:32.774 [trace] Determined IPyWidgets Version as undefined
13:35:32.775 [trace] Initialize matplotlib for ~/Capstone/project-dominicdill/testing.ipynb
13:35:32.775 [trace] Executing silently Code (idle) = def __VSCODE_inject_module():\ndef __VSCODE_call_function(function, callback, data=None):
13:35:33.382 [trace] Executing silently Code (completed) = def __VSCODE_inject_module():\ndef __VSCODE_call_function(function, callback, data=None): with 0 output(s)
13:35:33.383 [debug] Requesting Kernel info
13:35:33.383 [trace] Got Kernel info
13:35:33.384 [trace] End running kernel initialization, now waiting for idle
13:35:33.384 [trace] Waiting for idle on (kernel): 4c72cb4b-f055-4274-8bac-41436c1a6280 -> idle
13:35:33.385 [trace] Finished waiting for idle on (kernel): 4c72cb4b-f055-4274-8bac-41436c1a6280 -> idle
13:35:33.385 [trace] End running kernel initialization, session is idle
13:35:33.387 [trace] Registering Kernel Completion Provider from kernel playground (Python 3.12.5) for language python
13:35:33.388 [trace] IPyWidgetScriptSource.initialize
13:35:33.389 [info] Kernel successfully started
13:35:33.390 [debug] getDataDirsImpl, Class name = Mr (started execution), Arg 1: "~/Capstone/project-dominicdill/testing.ipynb", Arg 2: "~/miniconda3/envs/playground/bin/python"
13:35:33.391 [debug] Get Custom Env Variables, Class name = Cm, completed in 1ms, has a truthy return value, Arg 1: undefined, Arg 2: "RunPythonCode"
13:35:33.391 [debug] Jupyter Paths : 
13:35:33.394 [info] Process Execution: ~/miniconda3/envs/playground/bin/python /home/~/.vscode-server/extensions/ms-toolsai.jupyter-2024.8.1-linux-x64/pythonFiles/printJupyterDataDir.py
13:35:33.406 [debug] getDataDirsImpl, Class name = Mr, completed in 16ms, has a truthy return value, Arg 1: "~/Capstone/project-dominicdill/testing.ipynb", Arg 2: "~/miniconda3/envs/playground/bin/python", Return Value: <Uri:/home/~/.local/share/jupyter>, <Uri:/home/~/miniconda3/envs/playground/share/jupyter>, <Uri:/usr/local/share/jupyter>, <Uri:/usr/share/jupyter>
13:36:03.049 [debug] Handle Execution of Cells 0,1,2,3,4,5,6,7,8,9,10,11 for ~/Capstone/project-dominicdill/testing.ipynb
13:36:03.051 [trace] Execute Notebook ~/Capstone/project-dominicdill/testing.ipynb. Step 1
13:36:03.052 [trace] Connect to Kernel ~/Capstone/project-dominicdill/testing.ipynb. Step 2
13:36:03.053 [trace] Connected to Kernel ~/Capstone/project-dominicdill/testing.ipynb. Step 3
13:36:03.054 [trace] executeCell 0. Step 4
13:36:03.056 [trace] executeCell 1. Step 4
13:36:03.057 [trace] executeCell 2. Step 4
13:36:03.058 [trace] executeCell 3. Step 4
13:36:03.061 [trace] executeCell 4. Step 4
13:36:03.062 [trace] executeCell 5. Step 4
13:36:03.063 [trace] executeCell 6. Step 4
13:36:03.064 [trace] executeCell 7. Step 4
13:36:03.065 [trace] executeCell 8. Step 4
13:36:03.066 [trace] executeCell 9. Step 4
13:36:03.067 [trace] executeCell 10. Step 4
13:36:03.069 [trace] executeCell 11. Step 4
13:36:03.071 [trace] Cell Index:0 sent to kernel
13:36:03.077 [trace] Start cell 0 execution @ 1726767363077 (clear output)
13:36:03.080 [debug] Kernel acknowledged execution of cell 0 @ 1726767363077
13:36:03.517 [trace] Cell 0 completed in 0.44s (start: 1726767363077, end: 1726767363517)
13:36:03.518 [trace] Cell 0 executed successfully
13:36:03.519 [trace] Cell Index:1 sent to kernel
13:36:03.525 [trace] Start cell 1 execution @ 1726767363525 (clear output)
13:36:03.526 [debug] Kernel acknowledged execution of cell 1 @ 1726767363525
13:36:03.571 [trace] Cell 1 completed in 0.046s (start: 1726767363525, end: 1726767363571)
13:36:03.572 [trace] Cell 1 executed successfully
13:36:03.573 [trace] Cell Index:2 sent to kernel
13:36:03.597 [trace] Start cell 2 execution @ 1726767363597 (clear output)
13:36:03.599 [debug] Kernel acknowledged execution of cell 2 @ 1726767363597
13:36:03.605 [trace] Cell 2 completed in 0.008s (start: 1726767363597, end: 1726767363605)
13:36:03.606 [trace] Cell 2 executed successfully
13:36:03.608 [trace] Cell Index:3 sent to kernel
13:36:03.621 [trace] Start cell 3 execution @ 1726767363621 (clear output)
13:36:03.622 [debug] Kernel acknowledged execution of cell 3 @ 1726767363621
13:36:03.627 [trace] Cell 3 completed in 0.006s (start: 1726767363621, end: 1726767363627)
13:36:03.628 [trace] Cell 3 executed successfully
13:36:03.631 [trace] Cell Index:4 sent to kernel
13:36:03.658 [trace] Start cell 4 execution @ 1726767363657 (clear output)
13:36:03.659 [debug] Kernel acknowledged execution of cell 4 @ 1726767363657
13:36:16.661 [trace] Queue code ms-toolsai.datawrangler-1 from ms-toolsai.datawrangler after 0ms:

def __DW_GET_DF_VARS__():
    # see definition of these in the debug session
    global __DW_LOCALS__
    global __DW_GLOBALS__

    import json
    import builtins
    import sys

    try:
        import IPython
        curr_scope = { **builtins.globals(), **builtins.locals() }
        curr_globals = IPython.get_ipython().user_ns
        hidden_scope = IPython.get_ipython().user_ns_hidden
    except:
        curr_scope = { **__DW_LOCALS__, **__DW_GLOBALS__ }
        curr_globals = __DW_GLOBALS__
        hidden_scope = {}

    supported_dataframe_types = {
        builtins.list: "list",
        builtins.dict: "dictionary",
    }

    if 'numpy' in sys.modules:
        try:
            import numpy as np
            supported_dataframe_types[np.ndarray] = "ndarray"
        except:
            pass

    if 'pandas' in sys.modules:
        try:
            import pandas as pd
            supported_dataframe_types[pd.DataFrame] = "pandas"
            supported_dataframe_types[pd.Series] = "series"
        except:
            pass

    if 'polars' in sys.modules:
        try:
            import polars as pl
            supported_dataframe_types[pl.DataFrame] = "polars"
            supported_dataframe_types[pl.Series] = "polarsSeries"
        except:
            pass

    if 'xarray' in sys.modules:
        try:
            import xarray as xr
            supported_dataframe_types[xr.DataArray] = "dataArray"
        except:
            pass

    if 'tensorflow' in sys.modules:
        try:
            import tensorflow as tf
            supported_dataframe_types[tf.Tensor] = "eagerTensor"
        except:
            pass

    if 'torch' in sys.modules:
        try:
            import torch
            supported_dataframe_types[torch.Tensor] = "tensor"
        except:
            pass

    if 'pyspark' in sys.modules and False:
        try:
            import pyspark
            supported_dataframe_types[pyspark.sql.dataframe.DataFrame] = "pyspark"
        except:
            pass

    try:
        # we still need this due to https://github.com/ipython/ipykernel/issues/795
        import IPython
        capture_output = IPython.utils.capture.capture_output
    except:
        class capture_output():
            def __init__(self):
                import io
                self.stdoutio = io.StringIO()
                self.stderrio = io.StringIO()
            @property
            def stdout(self):
                return self.stdoutio.getvalue()
            @property
            def stderr(self):
                return self.stderrio.getvalue()
            def __enter__(self):
                from contextlib import redirect_stdout, redirect_stderr
                self.redirect_stdout = redirect_stdout(self.stdoutio)
                self.redirect_stderr = redirect_stderr(self.stderrio)
                self.redirect_stdout.__enter__(),
                self.redirect_stderr.__enter__()
                return self
            def __exit__(self, *args, **kwargs):
                result_stdout = self.redirect_stdout.__exit__(*args, **kwargs)
                result_stderr = self.redirect_stderr.__exit__(*args, **kwargs)
                return result_stdout and result_stderr

    with capture_output():
        def is_local(k):
            return k not in curr_globals or curr_scope[k] is not curr_globals[k]
        def valid_keys(k):
            # see https://github.com/ipython/ipython/blob/main/IPython/core/magics/namespace.py#L275
            return not k.startswith('_') and not (k in hidden_scope and not is_local(k))

        variables = sorted(filter(valid_keys, curr_scope.keys()))
        dataframe_vars = {d: curr_scope[d] for d in variables}
        dataframes_with_metadata = [
            {
                "variableName": v,
                "type": supported_dataframe_types[df_type], # use dict key instead of 'type(dataframes_vars[v])' because the type may not exist in the dict
                "supportedEngines": ["pandas"],
                "isLocalVariable": is_local(v),
            }
            for v in variables
            for df_type in supported_dataframe_types.keys()
            if isinstance(dataframe_vars[v], df_type)
        ]

    builtins.print(json.dumps(dataframes_with_metadata))

try:
    __DW_GET_DF_VARS__()
finally:
    del __DW_GET_DF_VARS__
13:37:23.406 [debug] VS Code interrupted kernel for ~/Capstone/project-dominicdill/testing.ipynb
13:37:23.407 [debug] Command interrupted kernel for ~/Capstone/project-dominicdill/testing.ipynb
13:37:23.407 [debug] interrupt the kernel, options.disableUI=false for ~/Capstone/project-dominicdill/testing.ipynb
13:37:23.408 [info] Interrupt kernel execution
13:37:23.408 [debug] Cancel pending cells
13:37:23.409 [trace] Cell 5 completed in 0s (start: undefined, end: undefined)
13:37:23.413 [trace] Cell 6 completed in 0s (start: undefined, end: undefined)
13:37:23.417 [trace] Cell 7 completed in 0s (start: undefined, end: undefined)
13:37:23.424 [trace] Cell 8 completed in 0s (start: undefined, end: undefined)
13:37:23.428 [trace] Cell 9 completed in 0s (start: undefined, end: undefined)
13:37:23.433 [trace] Cell 10 completed in 0s (start: undefined, end: undefined)
13:37:23.443 [trace] Cell 11 completed in 0s (start: undefined, end: undefined)
13:37:23.451 [trace] Execution Id:ms-toolsai.datawrangler-1. Execution cancelled.
13:37:23.454 [trace] Execution Id:ms-toolsai.datawrangler-1. Execution disposed.
13:37:23.460 [info] Interrupt requested ~/Capstone/project-dominicdill/testing.ipynb
13:37:23.476 [info] Disposing request as the cell (-1) was deleted ~/Capstone/project-dominicdill/testing.ipynb
13:37:23.476 [trace] Cell -1 completed in -1726767363.657s (start: 1726767363657, end: undefined)
13:37:23.481 [debug] Execution of code ms-toolsai.datawrangler-1 completed in 66820ms
13:37:23.482 [warn] Cell completed with errors (cancelled)
13:37:23.483 [info] Interrupt kernel execution
13:37:23.484 [info] Interrupting kernel: python3125jvsc74a57bd0180e1a7945f365280e7d6ed923e7a82d1f1f5c227f2dc7f4a8a5be0562907830
13:37:23.484 [info] Interrupting kernel via SIGINT
13:37:23.484 [trace] Cell -1 executed successfully
13:37:23.485 [trace] Cell -1 executed successfully
13:37:23.485 [trace] Cell -1 executed successfully
13:37:23.486 [trace] Cell -1 executed successfully
13:37:23.486 [trace] Cell -1 executed successfully
13:37:23.486 [trace] Cell -1 executed successfully
13:37:23.487 [trace] Cell 11 executed successfully
13:37:23.490 [trace] Cell -1 executed successfully
13:37:23.495 [warn] Cancel all remaining cells due to cancellation or failure in execution
13:37:23.495 [debug] Cancel pending cells
13:37:23.498 [info] Interrupt requested & sent for ~/Capstone/project-dominicdill/testing.ipynb in notebookEditor.
13:37:23.538 [debug] Interpreter for Pylance for Notebook URI "~/Capstone/project-dominicdill/testing.ipynb" is ~/miniconda3/envs/playground/bin/python
13:37:23.982 [warn] Unhandled message found: execute_reply
13:37:25.758 [trace] Queue code ms-toolsai.datawrangler-2 from ms-toolsai.datawrangler after 1ms:

def __DW_GET_DF_VARS__():
    # see definition of these in the debug session
    global __DW_LOCALS__
    global __DW_GLOBALS__

    import json
    import builtins
    import sys

    try:
        import IPython
        curr_scope = { **builtins.globals(), **builtins.locals() }
        curr_globals = IPython.get_ipython().user_ns
        hidden_scope = IPython.get_ipython().user_ns_hidden
    except:
        curr_scope = { **__DW_LOCALS__, **__DW_GLOBALS__ }
        curr_globals = __DW_GLOBALS__
        hidden_scope = {}

    supported_dataframe_types = {
        builtins.list: "list",
        builtins.dict: "dictionary",
    }

    if 'numpy' in sys.modules:
        try:
            import numpy as np
            supported_dataframe_types[np.ndarray] = "ndarray"
        except:
            pass

    if 'pandas' in sys.modules:
        try:
            import pandas as pd
            supported_dataframe_types[pd.DataFrame] = "pandas"
            supported_dataframe_types[pd.Series] = "series"
        except:
            pass

    if 'polars' in sys.modules:
        try:
            import polars as pl
            supported_dataframe_types[pl.DataFrame] = "polars"
            supported_dataframe_types[pl.Series] = "polarsSeries"
        except:
            pass

    if 'xarray' in sys.modules:
        try:
            import xarray as xr
            supported_dataframe_types[xr.DataArray] = "dataArray"
        except:
            pass

    if 'tensorflow' in sys.modules:
        try:
            import tensorflow as tf
            supported_dataframe_types[tf.Tensor] = "eagerTensor"
        except:
            pass

    if 'torch' in sys.modules:
        try:
            import torch
            supported_dataframe_types[torch.Tensor] = "tensor"
        except:
            pass

    if 'pyspark' in sys.modules and False:
        try:
            import pyspark
            supported_dataframe_types[pyspark.sql.dataframe.DataFrame] = "pyspark"
        except:
            pass

    try:
        # we still need this due to https://github.com/ipython/ipykernel/issues/795
        import IPython
        capture_output = IPython.utils.capture.capture_output
    except:
        class capture_output():
            def __init__(self):
                import io
                self.stdoutio = io.StringIO()
                self.stderrio = io.StringIO()
            @property
            def stdout(self):
                return self.stdoutio.getvalue()
            @property
            def stderr(self):
                return self.stderrio.getvalue()
            def __enter__(self):
                from contextlib import redirect_stdout, redirect_stderr
                self.redirect_stdout = redirect_stdout(self.stdoutio)
                self.redirect_stderr = redirect_stderr(self.stderrio)
                self.redirect_stdout.__enter__(),
                self.redirect_stderr.__enter__()
                return self
            def __exit__(self, *args, **kwargs):
                result_stdout = self.redirect_stdout.__exit__(*args, **kwargs)
                result_stderr = self.redirect_stderr.__exit__(*args, **kwargs)
                return result_stdout and result_stderr

    with capture_output():
        def is_local(k):
            return k not in curr_globals or curr_scope[k] is not curr_globals[k]
        def valid_keys(k):
            # see https://github.com/ipython/ipython/blob/main/IPython/core/magics/namespace.py#L275
            return not k.startswith('_') and not (k in hidden_scope and not is_local(k))

        variables = sorted(filter(valid_keys, curr_scope.keys()))
        dataframe_vars = {d: curr_scope[d] for d in variables}
        dataframes_with_metadata = [
            {
                "variableName": v,
                "type": supported_dataframe_types[df_type], # use dict key instead of 'type(dataframes_vars[v])' because the type may not exist in the dict
                "supportedEngines": ["pandas"],
                "isLocalVariable": is_local(v),
            }
            for v in variables
            for df_type in supported_dataframe_types.keys()
            if isinstance(dataframe_vars[v], df_type)
        ]

    builtins.print(json.dumps(dataframes_with_metadata))

try:
    __DW_GET_DF_VARS__()
finally:
    del __DW_GET_DF_VARS__
13:37:25.762 [trace] Execution Id:ms-toolsai.datawrangler-2. Start Code execution.
13:37:25.762 [trace] Execution Id:ms-toolsai.datawrangler-2. Send code for execution.
13:37:25.763 [trace] Execution Id:ms-toolsai.datawrangler-2. Execution Request Sent to Kernel.
13:37:25.781 [trace] Execution Id:ms-toolsai.datawrangler-2. Executed successfully.
13:37:25.782 [debug] Execution of code ms-toolsai.datawrangler-2 completed in 25ms
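
For context, the injected `__DW_GET_DF_VARS__` payload above enumerates kernel variables and tags any whose type matches a known dataframe-like class. Its core pattern, checking `sys.modules` before importing so the probe never loads new libraries into the user's kernel, can be sketched in isolation (the function and variable names below are illustrative, not part of the extension):

```python
import sys

def detect_dataframe_like(scope):
    # Map types to labels, but only for libraries the user has already
    # imported; checking sys.modules first avoids importing anything new
    # into the kernel as a side effect of the probe.
    supported = {list: "list", dict: "dictionary"}
    if 'pandas' in sys.modules:
        import pandas as pd
        supported[pd.DataFrame] = "pandas"
        supported[pd.Series] = "series"
    return [
        {"variableName": name, "type": label}
        for name, value in scope.items()
        if not name.startswith('_')          # skip hidden/internal names
        for t, label in supported.items()
        if isinstance(value, t)              # match against each known type
    ]

print(detect_dataframe_like({"xs": [1, 2, 3], "cfg": {"a": 1}, "_tmp": []}))
# → [{'variableName': 'xs', 'type': 'list'}, {'variableName': 'cfg', 'type': 'dictionary'}]
```

The real payload additionally merges IPython's `user_ns`/`user_ns_hidden` scopes and captures stdout/stderr so the probe's own output does not leak into cell results.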
13:37:27.456 [trace] Queue code ms-toolsai.datawrangler-3 from ms-toolsai.datawrangler after 0ms:

(payload identical to the __DW_GET_DF_VARS__ code sent for ms-toolsai.datawrangler-2 above; elided)
13:37:27.456 [trace] Execution Id:ms-toolsai.datawrangler-3. Start Code execution.
13:37:27.457 [trace] Execution Id:ms-toolsai.datawrangler-3. Send code for execution.
13:37:27.457 [trace] Execution Id:ms-toolsai.datawrangler-3. Execution Request Sent to Kernel.
13:37:27.475 [trace] Execution Id:ms-toolsai.datawrangler-3. Executed successfully.
13:37:27.475 [debug] Execution of code ms-toolsai.datawrangler-3 completed in 19ms
13:37:28.875 [trace] Queue code ms-toolsai.datawrangler-4 from ms-toolsai.datawrangler after 0ms:

(payload identical to the __DW_GET_DF_VARS__ code sent for ms-toolsai.datawrangler-2 above; elided)
13:37:28.878 [trace] Execution Id:ms-toolsai.datawrangler-4. Start Code execution.
13:37:28.879 [trace] Execution Id:ms-toolsai.datawrangler-4. Send code for execution.
13:37:28.881 [trace] Execution Id:ms-toolsai.datawrangler-4. Execution Request Sent to Kernel.
13:37:28.908 [trace] Execution Id:ms-toolsai.datawrangler-4. Executed successfully.
13:37:28.909 [debug] Execution of code ms-toolsai.datawrangler-4 completed in 34ms
amunger commented 1 day ago

This might be due to saving and the file watcher not handling the change on disk appropriately. I would need the logs from core to determine that, though. Could you:

  1. set the log level to trace with Developer: Set Log Level
  2. repro the issue
  3. collect the logs from the Output window for
    1. Window (just around the time of the repro, since there will be a lot)
    2. Notebook
    3. and Jupyter again
dominicdill commented 1 day ago

I will work on that. Could you give me more detailed info on how to show the logs? What exactly do I type into the Command Palette to open each log? (screenshot attached)

amunger commented 1 day ago

It's in the Output panel at the bottom; select each of those items from the dropdown to get the different logs.

(screenshot attached)