Open nikfio opened 1 year ago
Hi @nikfio, thank you for the feedback! Could you update to our latest release (Spyder 5.3.2) and check again? Also, could you share some example code that would enable us to reproduce this on our side? Any new info is greatly appreciated. Let us know!
Hello @dalthviz, thank you for your reply. I have updated Spyder to version 5.3.2 as you suggested, but the problem keeps occurring. I'll try to elaborate. Basically, I'm testing my code with breakpoints, and when a breakpoint is reached I have this situation
Then I click the 'Save data as' button just above the Variable Explorer, and the saving operation crashes.
The crash prints a sort of error traceback in the IPdb console.
Let me know if you need more information.
Thanks Nick
@dalthviz I'm doing a deeper investigation on this issue.
Here's a larger screenshot of the IPdb console with the save data crash.
You can see many repeated rows like this one:
File "C:\python_envs\Systems_analysis_n_prediction\lib\site-packages\dotty_dict\dotty_dict.py", line 89 in __getattr__
I'm currently using a library called dotty_dict (https://dotty-dict.readthedocs.io/en/latest/dotty_dict.html). It really seems that instantiating a dotty object
from dotty_dict import dotty
test_dotty = dotty()
makes Spyder's save data operation go haywire.
Avoiding this instance, data saving works fine. It just shows this pop-up, which I think is expected (?)
Indeed, the self object is not saved, but all the other variables are fine.
I hope this debugging helps you.
Nick
Hi @nikfio, thank you for the new info! I was able to reproduce this by running the dotty example code. I got the same message in the console before it restarts:
Windows fatal exception: stack overflow
Main thread:
Thread 0x000017e4 (most recent call first):
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\selectors.py", line 315 in _select
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\selectors.py", line 324 in select
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\asyncio\base_events.py", line 1860 in _run_once
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\asyncio\base_events.py", line 600 in run_forever
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\site-packages\tornado\platform\asyncio.py", line 215 in start
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\site-packages\ipykernel\kernelapp.py", line 712 in start
File "e:\acer\documentos\spyder\spyder\external-deps\spyder-kernels\spyder_kernels\console\start.py", line 334 in main
File "e:\acer\documentos\spyder\spyder\external-deps\spyder-kernels\spyder_kernels\console\__main__.py", line 24 in <module>
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\runpy.py", line 86 in _run_code
File "C:\Users\dalth\anaconda3\envs\spyder-cf\lib\runpy.py", line 196 in _run_module_as_main
Restarting kernel...
Also, as I think you mentioned, checking the Exclude unsupported data types option before using the saving functionality prevents the error (although it also prevents saving some variables).
Also, checking the traceback this seems similar to the one at https://github.com/spyder-ide/spyder/issues/18930
Seems like having a custom __getattr__ doesn't play well with Jupyter kernels (used by Spyder, but also by Jupyter notebooks and other Jupyter-based tooling).
Maybe we should ensure saving just the data that is supported (even when the option Exclude unsupported data types isn't checked) @ccordoba12?
> Maybe we should ensure saving just data that is supported (even when the option Exclude unsupported data types isn't checked) @ccordoba12?
Yeah, I think that's a good idea. We could also try to switch to Dill or Cloudpickle as our libraries to save the current workspace because they do it in a more robust way.
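As a rough illustration of why cloudpickle (or dill) is considered more robust than the stdlib pickle for workspace-like data: plain pickle serializes functions by reference (module + qualified name), which fails for lambdas, while cloudpickle serializes them by value. The cloudpickle part below is guarded, since it is a third-party dependency that may not be installed.

```python
import pickle

# A lambda stands in for the kinds of objects a user workspace can hold
# that the stdlib pickle cannot serialize by reference.
f = lambda x: x + 1

try:
    pickle.dumps(f)
    plain_pickle_ok = True
except Exception:  # pickle raises PicklingError for the lambda
    plain_pickle_ok = False

print("stdlib pickle handled the lambda:", plain_pickle_ok)

# cloudpickle serializes the function body itself, so the same object
# round-trips (assuming cloudpickle is installed).
try:
    import cloudpickle
    g = pickle.loads(cloudpickle.dumps(f))
    print("cloudpickle round-trip:", g(1))
except ImportError:
    print("cloudpickle not installed; skipping")
```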
> Seems like having a custom __getattr__ doesn't play well with jupyter kernels (used by Spyder but also Jupyter notebooks and other jupyter-based tooling).
I looked into this a bit when I submitted #18905. It's not that custom __getattr__ implementations don't work with Spyder; they just have to be implemented really robustly to prevent the infinite recursion described in #18930. They also need to be well behaved (e.g. throwing AttributeError on non-existent attributes).
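The failure mode can be sketched with a pair of toy classes (the names here are made up for illustration, not from dotty_dict): a __getattr__ that reads one of its own instance attributes recurses forever when that attribute is missing, which is exactly the situation when pickle rebuilds an instance via __new__ without calling __init__.

```python
class FragileBag:
    """__getattr__ that assumes self._data always exists."""

    def __init__(self):
        self._data = {}

    def __getattr__(self, name):
        # BUG: if _data is missing (e.g. pickle recreates the instance
        # via __new__, skipping __init__), looking up self._data calls
        # __getattr__ again, recursing until RecursionError.
        return self._data[name]


class RobustBag:
    """Well-behaved variant: never recurses, raises AttributeError."""

    def __init__(self):
        self._data = {}

    def __getattr__(self, name):
        # Read _data straight from __dict__ so a missing _data cannot
        # re-enter __getattr__; unknown names raise AttributeError.
        data = self.__dict__.get('_data')
        if data is None or name not in data:
            raise AttributeError(name)
        return data[name]


# Simulate what unpickling does: create instances without __init__.
fragile = FragileBag.__new__(FragileBag)
robust = RobustBag.__new__(RobustBag)

try:
    fragile.shape
except RecursionError:
    print("FragileBag: RecursionError")

try:
    robust.shape
except AttributeError:
    print("RobustBag: AttributeError")
```

The guard in RobustBag is the whole fix: every attribute lookup inside __getattr__ must go through a path that cannot trigger __getattr__ again.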
It seems like Spyder will sometimes query (possibly non-existent) attributes like some_var.shape or similar on session variables, so the __getattr__ implementation has to be ready for this (e.g. not throwing an AttributeError will cause problems, if I remember correctly). Maybe something similar is happening when saving the workspace?
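A small demonstration of the wrong-exception-type problem (the class name is invented for illustration): the three-argument form of getattr only suppresses AttributeError, so probing code that asks for something like .shape with a default still crashes if __getattr__ leaks a different exception.

```python
class Misbehaved:
    def __getattr__(self, name):
        # Leaks KeyError instead of raising AttributeError.
        raise KeyError(name)


m = Misbehaved()

# getattr(obj, name, default) only swallows AttributeError, so the
# KeyError escapes even though a default was supplied.
try:
    value = getattr(m, 'shape', None)
except KeyError:
    print("KeyError escaped getattr despite the default")
```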
Thanks @ncbloch for the feedback! Checking a little bit, I think that to save dict variables (and variables in general) we try to pickle them, which maybe could be causing the issue here. But yep, what you said makes a lot of sense :+1: In the end, for the Variable Explorer and IPython Console functionality, Spyder tries to get the attributes of variables in a way that can cause a not-so-robust __getattr__ implementation to end up raising a RecursionError (which, with the default interpreter, causes a stack overflow on Windows, as we described at #18930).
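One way to "save just the data that is supported" could be sketched as a hypothetical filter (this is not Spyder's actual implementation): probe each variable with pickle.dumps and drop the ones that fail, instead of letting one bad object abort the whole save.

```python
import pickle


def picklable_subset(namespace):
    """Return only the entries of `namespace` that pickle can handle."""
    safe = {}
    for name, value in namespace.items():
        try:
            pickle.dumps(value)
        except Exception:
            continue  # skip unsupported values instead of aborting the save
        safe[name] = value
    return safe


# The lambda stands in for an unpicklable workspace variable.
ns = {'a': 1, 'items': [1, 2, 3], 'bad': lambda x: x}
print(sorted(picklable_subset(ns)))  # the lambda is dropped
```

The broad except Exception is deliberate here: a misbehaving __getattr__ can surface as RecursionError or KeyError rather than a clean PicklingError, and all of those should just exclude the variable.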
Description
What steps will reproduce the problem?
I tried to save a workspace to a .spydata file and it failed. The workspace contained various data types: str, pandas objects, bool, and so on.
Traceback
Versions
Dependencies