PyProphet / pyprophet

PyProphet: Semi-supervised learning and scoring of OpenSWATH results.
http://www.openswath.org
BSD 3-Clause "New" or "Revised" License
29 stars, 21 forks

Export compounds scored on ms1 level (SCORE_MS1) #75

Closed oliveralka closed 4 years ago

oliveralka commented 4 years ago

- ADD: MS1 to export_compound (either ms1, ms1ms2, or ms2)
- ADD: new test with data for ms1, ms1ms2, and ms2 scoring (the test uses ms1 or ms2)
- RM: additional plot for compounds (it was never used, and the standard plotting works as expected)

@grosenberger Please let me know what you think and if you would like to incorporate it in a different way.

PS: I guess in the future ms1ms2 scoring (SCORE_MS2) will usually be used for compounds, but if people would like to test how their decoy model performs, I think it is really helpful to have MS1-only scoring (SCORE_MS1) available for export and plotting as well.
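To illustrate the intended workflow (a sketch only: the `score --level` flags appear in the test command further down in this thread, but the export subcommand name and its flags are assumptions based on this PR, so please check `pyprophet --help` for the actual interface):

```shell
# Score compounds on MS1 level only; this populates the SCORE_MS1 table
# in the OSW file (the score subcommand and --level flag are taken from
# the regression-test command shown later in this thread).
pyprophet score --in=test_data.osw --level=ms1

# Hypothetical: export the MS1-scored compounds added by this PR.
# The subcommand name "export-compound" is an assumption here.
pyprophet export-compound --in=test_data.osw
```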

oliveralka commented 4 years ago

Travis failed with: error: The 'hyperopt' distribution was not found and is required by pyprophet

grosenberger commented 4 years ago

Looks great, thank you!

grosenberger commented 4 years ago

Hmm, it seems I was a bit too quick with merging. There still seems to be an incompatibility with the regression tests. Could you please check?

oliveralka commented 4 years ago

Edit: I will take a look and get back to you.

oliveralka commented 4 years ago

@grosenberger I think this has nothing to do with this PR:

Python 3.7.4, pyprophet, version 2.1.4

pip freeze | grep hyperopt
hyperopt==0.2.1

tests/test_pyprophet_score.py ..........F. [ 83%]

    def test_osw_5(tmpdir, regtest):
>       _run_pyprophet_osw_to_learn_model(regtest, tmpdir.strpath, True, False, True, pi0_lambda="0 0 0", ms1ms2=True, xgboost=True, xgboost_tune=True)

Command 'pyprophet score --in=test_data.osw --level=ms1 --test --ss_iteration_fdr=0.02 --pfdr --pi0_lambda=0 0 0 --classifier=XGBoost --xgb_autotune score --in=test_data.osw --level=ms1ms2 --test --ss_iteration_fdr=0.02 --pfdr --pi0_lambda=0 0 0 --classifier=XGBoost --xgb_autotune score --in=test_data.osw --level=transition --test --ss_iteration_fdr=0.02 --pfdr --pi0_lambda=0 0 0 --classifier=XGBoost --xgb_autotune' returned non-zero exit status 1.

I get the following error when running the command:

Info: Learning on cross-validation fold.
Info: Learning on cross-validation fold.
Info: Learning on cross-validation fold.
Info: Learning on cross-validation fold.
Info: Learning on cross-validation fold.
Info: Finished learning.
Info: Learning on cross-validated scores.
Info: Autotuning of XGB hyperparameters.
--------------------------------------------------------------------------------                                                                           
CANT ENCODE                                                                                                                                                
--------------------------------------------------------------------------------         
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/bin/pyprophet", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/click/core.py", line 1163, in invoke
    rv.append(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/main.py", line 93, in score
    PyProphetLearner(infile, outfile, classifier, xgb_hyperparams, xgb_params, xgb_params_space, xeval_fraction, xeval_num_iter, ss_initial_fdr, ss_iteration_fdr, ss_num_iter, ss_main_score, group_id, parametric, pfdr, pi0_lambda, pi0_method, pi0_smooth_df, pi0_smooth_log_pi0, lfdr_truncate, lfdr_monotone, lfdr_transformation, lfdr_adj, lfdr_eps, level, ipf_max_peakgroup_rank, ipf_max_peakgroup_pep, ipf_max_transition_isotope_overlap, ipf_min_transition_sn, tric_chromprob, threads, test).run()
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/runner.py", line 253, in run
    (result, scorer, weights) = self.run_algo()
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/runner.py", line 415, in run_algo
    (result, scorer, weights) = PyProphet(self.classifier, self.xgb_hyperparams, self.xgb_params, self.xgb_params_space, self.xeval_fraction, self.xeval_num_iter, self.ss_initial_fdr, self.ss_iteration_fdr, self.ss_num_iter, self.group_id, self.parametric, self.pfdr, self.pi0_lambda, self.pi0_method, self.pi0_smooth_df, self.pi0_smooth_log_pi0, self.lfdr_truncate, self.lfdr_monotone, self.lfdr_transformation, self.lfdr_adj, self.lfdr_eps, self.tric_chromprob, self.threads, self.test).learn_and_apply(self.table)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/pyprophet.py", line 255, in learn_and_apply
    result, scorer, trained_weights = self._learn_and_apply(table)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/pyprophet.py", line 263, in _learn_and_apply
    final_classifier = self._learn(experiment)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/pyprophet.py", line 314, in _learn
    model = learner.learn_final(experiment)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/semi_supervised.py", line 82, in learn_final
    params, clf_scores = self.tune_semi_supervised_learning(experiment)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/semi_supervised.py", line 152, in tune_semi_supervised_learning
    self.inner_learner.tune(td_peaks, bt_peaks, True)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/pyprophet/classifiers.py", line 132, in tune
    best_complexity = fmin(fn=objective, space=xgb_params_complexity, algo=tpe.suggest, max_evals=self.xgb_hyperparams['autotune_num_rounds'], rstate=np.random.RandomState(42))
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/hyperopt/fmin.py", line 509, in fmin
    rval.exhaust()
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/hyperopt/fmin.py", line 330, in exhaust
    self.run(self.max_evals - n_done, block_until_done=self.asynchronous)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/hyperopt/fmin.py", line 270, in run
    self.trials.insert_trial_docs(new_trials)
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/hyperopt/base.py", line 455, in insert_trial_docs
    docs = [self.assert_valid_trial(SONify(doc)) for doc in docs]
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/hyperopt/base.py", line 455, in <listcomp>
    docs = [self.assert_valid_trial(SONify(doc)) for doc in docs]
  File "/usr/local/miniconda3/envs/npy37_pyprophet_test/lib/python3.7/site-packages/hyperopt/base.py", line 421, in assert_valid_trial
    bson.BSON.encode(trial)
AttributeError: module 'bson' has no attribute 'BSON'

see https://github.com/hyperopt/hyperopt/issues/547

It did not work with hyperopt==0.2.3 either. Can you reproduce this?
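For reference, the root cause discussed in the linked hyperopt issue is that the standalone `bson` package on PyPI shadows the `bson` module bundled with `pymongo`, and the standalone package does not provide the `BSON` class that hyperopt's trial validation calls. A commonly suggested workaround from that issue (an environment fix, not an official patch to either project) is:

```shell
# The standalone "bson" package shadows pymongo's bson module and lacks
# the BSON class that hyperopt's assert_valid_trial expects.
# Uninstalling it and reinstalling pymongo restores the expected module.
pip uninstall -y bson
pip install --force-reinstall pymongo
```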

grosenberger commented 4 years ago

Yes, that's correct, I'll try to fix it. Thank you!

oliveralka commented 4 years ago

Thanks! Let me know if I can test something along the way.