mayeshh closed this pull request 1 year ago.
Edit `appveyor.yml` and `.travis.yml` so tests can run on this PR. To be clear, you can add a new commit to this PR editing `appveyor.yml` and `.travis.yml` and specifying `pandas<=xxx`, where `xxx` is a previous pandas version.

I have done a small test comparing `norm.pdf` to the handwritten function `normpdf`. I am pretty certain that `norm.pdf` scales correctly and that the functions are essentially identical. See norm_pdf_unit_test.pdf. Should I try to add a test to `test_burstlib.py`? Or where would the test go?
@mayeshh, yes `test_burstlib.py` will work.
I put the test in `test_exp_fitting.py` because it seemed more relevant. I ran `pytest -v` and it passed; however, others failed: `61 passed, 17 warnings, 45 error in 15.13s`.

Is this expected?
All tests should run with no errors. Some warnings are possible, but no test should fail. All tests pass online, as you can see from the green checkmarks.
Will you merge this PR?
@mayeshh, yes, I'll merge it. Please fix the test. It seems you duplicated an old test and gave it a new name. Instead, inside the test function you should define the function `normpdf` and then test it with an assert like this:

`assert np.allclose(normpdf(x, mu=mu, sigma=sigma), norm.pdf(x, loc=mu, scale=sigma))`

for some `mu` and `sigma != 1`. Let me know if you have questions or doubts.

Also, please put the test in `test_burstlib.py`. The file you used is specific to the exponential fitting used for background estimation.
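A minimal sketch of the suggested test, assuming numpy and scipy are available; the exact `mu`, `sigma`, and `x` values here are illustrative, not taken from the PR:

```python
import numpy as np
from scipy.stats import norm

def test_normpdf():
    # Hand-written Gaussian PDF, defined inside the test as suggested above.
    def normpdf(x, mu=0.0, sigma=1.0):
        return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

    # Local variables defined inside the test (illustrative values, sigma != 1).
    x = np.arange(-5, 5, 0.01)
    mu, sigma = 0.8, 2.5
    assert np.allclose(normpdf(x, mu=mu, sigma=sigma),
                       norm.pdf(x, loc=mu, scale=sigma))
```

Defining `normpdf`, `x`, `mu`, and `sigma` inside the test function keeps the test self-contained, which also avoids the "local variables not defined" failure mentioned later in this thread.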
I am not sure how to fix the failed checks... I also have some questions:

- Do I need to add `@pytest.fixture(scope="module")` before the function? (I am not sure what the `@pytest.fixture` decorator does.)
- Did I add the `normpdf` function to the correct spot in the file?
- Do I need to define `x` first, or is it enough to just define the function?

Perfect. To answer your questions:
Regarding the decorator `@pytest.fixture`: no, you don't need to add it. You may ask what the heck pytest "fixtures" are. If you look at the test file there are 3 types of test functions:

- Functions taking `data` as input: these take the data passed as input and run the test. The argument `data` is the "fixture" defined at the beginning of the test file. It tells pytest to call that test multiple times, passing different arguments each time. In this case the same test is called for single-spot (`data_1ch`) and multi-spot (`data_8ch`) data, avoiding code repetition.
- Functions taking `data_8ch` or `data_1ch` as input: these functions take a specific dataset as input, so they are run only once.

(For more info check the pytest docs.)

Yes, you did add `normpdf` in the right place, thanks.
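A minimal sketch of the parametrized-fixture pattern described above; the fixture and test names here are hypothetical, not the ones in the actual FRETBursts test file:

```python
import pytest

# Hypothetical fixture: pytest calls each test that takes `data` once per
# entry in `params`, mimicking the single-spot / multi-spot setup described above.
@pytest.fixture(scope="module", params=["1ch", "8ch"])
def data(request):
    # In the real test file this would load a dataset; here we just
    # return the parameter name to illustrate the mechanism.
    return request.param

def test_runs_once_per_dataset(data):
    # pytest runs this test twice: once with "1ch", once with "8ch".
    assert data in ("1ch", "8ch")
```

With `scope="module"`, the fixture is created once per parameter for the whole test module rather than once per test, which avoids reloading the dataset for every test function.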
Yes, you need to define `x`, as you did already. Just move that line inside the test function, as it is only relevant there.
I also have a question. I see you added `np.fromiter` in a few places. Is it addressing a warning or error? If yes, can you please report the warning/error here for future records?

After you move the `x` line I'll merge this PR, thanks.
I had a look at the failing tests:

```
>       new_size = np.sum(np.fromiter((d.mburst[ich].num_bursts for d in d_list)))
E       TypeError: Required argument 'dtype' (pos 2) not found

fretbursts\burstlib_ext.py:796: TypeError
```

Also, there is a DeprecationWarning:

```
test_burstlib.py:317: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.fromiter(generator)) or the python sum builtin instead.
  bg_t = [np.sum(data.bg[s][ich] for s in streams) for ich in range(data.nch)]
```

Maybe you were trying to fix this warning and now you have the error? I think the easiest fix may be using the second option suggested in the warning, i.e. replacing `np.sum` with the builtin `sum` (reverting the `np.fromiter` addition). The solution you tried should also work, but according to the error message you are also required to pass a second argument to `np.fromiter`, called `dtype`.
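A self-contained illustration of the two fixes the warning suggests; the values here are made up, not taken from the FRETBursts code:

```python
import numpy as np

values = [1.0, 2.0, 3.0]

# Deprecated pattern that triggers the DeprecationWarning:
#   total = np.sum(v * 2 for v in values)

# Fix 1: the Python builtin sum (the simpler fix adopted in this PR).
total1 = sum(v * 2 for v in values)

# Fix 2: np.fromiter, which requires the dtype argument -- the missing
# second positional argument that caused the TypeError above.
total2 = np.sum(np.fromiter((v * 2 for v in values), dtype=float))

assert total1 == total2 == 12.0
```

The builtin `sum` consumes the generator directly, while `np.fromiter` must first materialize it into an array and therefore needs to know the element `dtype` up front.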
Yes, I was trying to fix the deprecation warning. I replaced it with `sum` as you suggested. I will submit an issue for the record.

The last commit 5ac3153 failed because I did not define local variables in `test_normpdf`.
I don't understand how the test `test_burst_selection_nocorrections`, taking `load_dataset_1ch` and `load_dataset_8ch`, is causing the travis-ci test to fail now... I see that there are runtime warnings, but I have not changed anything in this code, so I am not sure why this fails now.
@mayeshh, I agree this last test failure is weird. All tests passed for Python 3.6 on AppVeyor (Windows) at least. I restarted the tests in case it was only a temporary fluke.
Did you guys get anywhere with this? I'd be keen to help out in some way if I can because people in my lab keep running into difficulties trying to install older versions of fretbursts/matplotlib that work properly.
I couldn't solve the failed checks... it seems to be something to do with Python 3.7. Maybe @tritemio has some insight?
I don't have time to look into it. You can open a PR and fix the issue. For the merge, I am willing to grant commit rights to anybody who wants to manage the repo.
Okay I will pick up from where mayeshh left off and see if I can get to the bottom of it.
In the meantime, can you advise on the best way to install an older version of fretbursts (so we can just use a downgraded matplotlib)? I cannot get it to work consistently, and this is also important for reproducibility anyway.
@BenjaminAmbrose, in my papers I generally save a conda env with all the packages and package versions used, not only FRETBursts. You can export the conda env with `conda env export > environment.yml`. Note that these files are not necessarily multi-platform, so I sometimes save several versions of the env, one for each platform. A nice bonus of doing this is that including the env file in your git repo will make it auto-magically executable on the cloud using mybinder. So you will always have that env available when you need it, even if you don't have it installed anymore (not to mention that anybody can run your analysis on the cloud).

As a secondary measure, I generally print the versions of the packages used at the beginning of each notebook. This is less robust, and redundant if you save the full env, but it is an additional layer of safety.
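The version-printing habit described above can be sketched like this; the package list is only an example:

```python
import importlib

# Print the version of each package used by the analysis, so the notebook
# records its own environment. The package names are illustrative.
for pkg in ("numpy", "scipy", "matplotlib", "fretbursts"):
    try:
        mod = importlib.import_module(pkg)
        print(pkg, getattr(mod, "__version__", "unknown version"))
    except ImportError:
        print(pkg, "not installed")
```

Using `importlib` with a `try/except` keeps the cell from crashing when a package is missing, so the version report always runs to completion.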
Now, if you did some analysis 1-2 years ago and didn't save the env, it would be much harder to recover it after the fact. Installing an old version of FRETBursts will likely require installing old versions of numpy, scipy, matplotlib... you would need to figure out which versions were available at that time.
> I don't have time to look into it. You can open a PR and fix the issue. For the merge, I am willing to grant commit rights to anybody who wants to manage the repo.
@BenjaminAmbrose do you want commit rights? If not, it seems like I might be the only other option... I am not an expert and I would happily relinquish commit rights to someone more experienced that wants to maintain the repo.
I'm not sure I'm any more of an expert than you are; managing these build check things is new to me. I don't know how useful it would be for me to be in charge of it.
Since these same changes have been made in my other pull requests, I'm closing this pull request.
Attempting to fix the issue reported in #30.
Error seems to be resolved, but I now get the following AttributeError:

```
Traceback (most recent call last)
<ipython-input-2-e7d4478a60c7> in <module>
----> 1 from fretbursts import *

AttributeError: module 'fretbursts' has no attribute 'bg_cache'
```
I see that the checks have failed, and I am doing my best to understand them. The offending code raises a `ValueError` that does not arise from any changes I have made.