katduecker closed this 3 weeks ago
Just some flake8 errors and one comment to address but otherwise this is about ready to merge! Feel free to add [MRG] to the title when you're ready for a final code review
@katduecker can you add a tiny test to check that the deprecation warning is raised? Also, do any of our old tests or examples use tmin or tmax? We should update them if they do
@jasmainak thanks for bringing this up! Trying this out locally, it seems that a DeprecationWarning is suppressed in Jupyter Notebook, which I just found out is the default in many Python environments.
The options I found to fix this were either to use warnings.simplefilter('always', DeprecationWarning) at the beginning of the script (probably not advisable?) or to use UserWarning instead of DeprecationWarning. DeprecationWarning may be used in other functions too, so I would apply that fix there as well. What do you think?
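The difference in visibility described here can be demonstrated directly with the standard `warnings` module. This is a minimal sketch, not hnn-core code; the filter list below simulates an environment (like many interactive shells) that suppresses DeprecationWarning but lets UserWarning through:

```python
import warnings

def warning_is_shown(category, filters):
    """Return True if a warning of `category` gets through `filters`.

    `filters` is a list of (action, category) pairs applied in order;
    later entries take precedence, as with warnings.simplefilter.
    """
    with warnings.catch_warnings(record=True) as caught:
        warnings.resetwarnings()
        for action, cat in filters:
            warnings.simplefilter(action, cat)
        warnings.warn("tmin is deprecated", category)
        return len(caught) > 0

# Simulated environment: DeprecationWarning dropped, UserWarning shown
env = [("ignore", DeprecationWarning), ("always", UserWarning)]
print(warning_is_shown(DeprecationWarning, env))  # False
print(warning_is_shown(UserWarning, env))         # True
```

This is why swapping the category changes what users see even though the library code is otherwise identical.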
@katduecker you should become familiar with the pytest framework. See:
the rest of that page is informative too. You'll see the different ways to capture and check warnings.
Jupyter Notebook is good for data exploration, but it is not an ideal environment for coding. I'd recommend using VS Code (or another editor) plus a terminal. The way to run your tests is to do:
$ pytest .
then if you want to do post-mortem debugging, you can do:
$ pytest . --pdb
etc.
@jasmainak sorry, it seems that I didn't express myself clearly. My question was whether I should use UserWarning instead of DeprecationWarning, in case users are working in a Python environment that suppresses DeprecationWarnings, as seems to be the default in Jupyter Notebook, for instance. I am working in VS Code.
oh I see, interesting question! I didn't know that. A bit of digging revealed that FutureWarning is now recommended over DeprecationWarning: one is for developers, the other for users ... we might want to make this change repo-wide. Can be your next PR!
Okay, I'll add the pytest for the deprecation warning for now, and then we can decide where to go from there. Should the deprecation test be added to test_viz.py?
yep, add it to test_viz.py
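The test being discussed could look something like the sketch below. The function name, signature, and warning message are illustrative stand-ins, not the actual hnn-core API; the real test in test_viz.py would call the real plotting function:

```python
import warnings
import pytest

def plot_dipole(dpl=None, tmin=None):
    """Stand-in for the real plotting function (names assumed)."""
    if tmin is not None:
        warnings.warn("tmin is deprecated; the x-axis now ends at tstop",
                      DeprecationWarning)

def test_tmin_deprecation():
    # pytest.warns fails the test if no matching warning is raised
    with pytest.warns(DeprecationWarning, match="tmin is deprecated"):
        plot_dipole(tmin=0.0)
```

`pytest.warns` works regardless of the environment's warning filters, so the suppression issue discussed above does not affect the test itself.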
@katduecker one last comment and then you can update whats_new.rst!
@jasmainak @ntolley hey, I'm not sure why the tests were canceled, since they pass locally. Any way to resolve this?
you have a rebase issue ... I see 126 commits!
It was all working in commit 59f1620. I then struggled to rebase because a few lines had been added to whats_new.rst just before I did. It took a few tries to get it right (that was 3 commits), but the rebase was eventually successful as of 3c422b5, and whats_new.rst looks sensible now.
@katduecker the tests failing is a separate issue we're working on so no worries!
Also, I think there's some confusion about what the rebase issue is. @jasmainak is referring to the fact that a large number of commits from other PRs are included here. If you look at the "Files Changed" tab, the majority of these changes are not ones you made (e.g., mine and George's commits from another PR that was merged into the master branch).
I'm not sure which commands were run, but this typically happens when git rebase is used incorrectly.
To avoid this in the future, it's best to verify that the rebase (and merge conflict resolution) was successful by running git log locally and checking that 1) the most recent commits are exclusively ones from your PR, and 2) the commits before your PR match exactly the ones from the master branch. If something went wrong, you can pull the last version from GitHub and try again.
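A quick way to run that check is with git log's range syntax. The remote name "upstream" is an assumption here; substitute whatever your fork setup calls the main repository's remote:

```shell
# Fetch the latest master from the main repository
git fetch upstream
# Should list ONLY the commits you made on this PR branch
git log --oneline upstream/master..HEAD
# Should match the top of the master branch exactly
git log --oneline -5 upstream/master
```

If the first command lists commits you don't recognize, the rebase pulled in other people's history and needs fixing.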
Now that these changes are pushed to GitHub, @jasmainak, is the best option to reset and cherry-pick these commits, or something else?
honestly given the rebase issues ... I feel the easiest is to copy the edits into a new branch and make commits on that branch. If you'd like to give credit to the previous contributors, you can use co-authored commits. Then rename the branch to tstop_plot and force push.
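Concretely, the suggested recovery could look like the sketch below. Branch and remote names ("tstop_plot", "upstream", "origin") and the co-author line are assumptions; review the diff before committing:

```shell
# Start from a clean, up-to-date master
git checkout master && git pull upstream master
git checkout -b tstop_plot_fixed
# Bring over the edited files from the broken branch (review the diff!)
git checkout tstop_plot -- .
git add -A
git commit -m "End x-axis at tstop; deprecate tmin/tmax

Co-authored-by: Other Contributor <other@example.com>"
# Take over the old branch name and overwrite it on your fork
git branch -D tstop_plot
git branch -m tstop_plot_fixed tstop_plot
git push --force origin tstop_plot
```

The co-authored-by trailer is how GitHub attributes the commit to the previous contributors mentioned above.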
Sorry @katduecker totally understand that rebase issues are frustrating to understand/fix
The skills for resolving them are super useful for code development, and you'll get a lot of street cred as a git superuser :smile:
Thanks for clarifying, both! I'll do what Mainak said and copy the edits into a new branch. And thanks for clarifying what the rebase issue was! Do I need to wait until the problem with the tests failing is resolved?
No need to wait on the tests being fixed! It's an issue specific to the GitHub servers that run the tests, and only the Ubuntu tests are impacted.
Your local tests (and the unit tests for macOS and Windows) are still passing, so you're all good from a unit-testing standpoint.
This pull request supersedes PR#752
The x-axis (time) of all plots now ends at tstop. A deprecation cycle for tmin and tmax has been added to plot_laminar_lfp and plot_dipole; tmin and tmax are still required for plot_psd and plot_tfr_morlet.
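A deprecation cycle like the one described could be sketched as follows. The function name, signature, and return values are illustrative, not the actual hnn-core implementation; the point is that the old arguments keep working (with a warning) until they are removed:

```python
import warnings
import numpy as np

def plot_dipole(times, data, tmin=None, tmax=None):
    """Sketch: return the (times, data) that would be plotted."""
    times, data = np.asarray(times), np.asarray(data)
    if tmin is not None or tmax is not None:
        warnings.warn("tmin and tmax are deprecated and will be removed; "
                      "the x-axis now ends at tstop",
                      DeprecationWarning)
        # Honor the old arguments during the deprecation cycle
        lo = times[0] if tmin is None else tmin
        hi = times[-1] if tmax is None else tmax
        mask = (times >= lo) & (times <= hi)
        times, data = times[mask], data[mask]
    # New default behavior: the full simulation, ending at tstop
    return times, data
```

Callers who pass nothing get the new tstop-bounded axis; callers still passing tmin/tmax get the old behavior plus the warning that the test above checks for.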