benbowen / thoughts

Ideas and thoughts on data analysis and interpretation

Importing Dependencies - Missing File in 'metatlas.helpers' #1

Closed by Roli-Wilhelm 6 years ago

Roli-Wilhelm commented 6 years ago

Hi Ben,

I didn't have much trouble installing the Metatlas module in a new conda environment. I can import the whole module, but when I run the first import call ("from metatlas.helpers import dill2plots as dp") I get the following error:

No such file or directory: '/project/projectdirs/metatlas/mysql_user.txt'

The good news is that it's an obvious problem; the bad news is that I don't know what a mysql_user.txt file is supposed to contain. Please advise.

Thanks in advance for your help, Roli

p.s. I'm obviously only just getting to this now. It has been a crazy past month, but I can dedicate myself to analyzing this data now. Also, there was a great talk on SIP-proteomics software near the end of the JGI meeting by Manuel Kleiner. It looks like they built code to quantify 13C-enrichment for peptides in a manner that could work for metabolites. We won't be able to use a reference to correct against (since we don't have the luxury of plucking a hair that contains our amino acids), but we could adapt their method for calculating delta-13C (it starts on line 402 of their bioRxiv preprint, found HERE).
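For reference, here is a minimal sketch of the standard delta notation that a calculation like theirs builds on. This is the textbook isotope-ratio formula, not their code, and the VPDB constant should be checked against the preprint:

    # delta13C (per mil) = (R_sample / R_standard - 1) * 1000,
    # where R = 13C/12C and R_standard is the VPDB reference ratio.
    VPDB_R = 0.011180  # commonly cited VPDB 13C/12C ratio; confirm against the preprint

    def delta_13c(c13_signal, c12_signal, r_standard=VPDB_R):
        """Delta-13C in per mil from measured 13C and 12C signal intensities."""
        r_sample = c13_signal / c12_signal
        return (r_sample / r_standard - 1.0) * 1000.0

    print(delta_13c(0.0125, 1.0))  # a lightly enriched sample, roughly +118 per mil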

Roli-Wilhelm commented 6 years ago

You can see my notebook HERE: https://github.com/Roli-Wilhelm/SIP-metabolomics

benbowen commented 6 years ago

I did see his talk, and I mentioned to him afterwards that we did isotope fitting in metaproteomics quite a long time ago (using two different methods). https://pubs.acs.org/doi/10.1021/cb400210q

Likely because of the title of our manuscript, the metaproteomic isotope methods get overlooked.

Can you comment out those imports? I'm trying to remember, but I don't think you need any of the database functionality for your notebook.

I'll be able to look at this more closely in a few days.
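For example (just a sketch, assuming the failure really is only at import time and that dp is never actually needed), the import could be guarded instead of deleted so the rest of the notebook still runs:

    # Guard the metatlas import so a missing NERSC config file doesn't stop the notebook.
    # If dp is used in later cells, those cells still need to be removed or reworked.
    try:
        from metatlas.helpers import dill2plots as dp
    except (IOError, OSError) as err:
        dp = None
        print('metatlas database helpers unavailable:', err)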


Roli-Wilhelm commented 6 years ago

Thanks for sharing the pub. Any insights from that work that we can apply to calculating delta-13C from our metabolites?

Regarding the code, has metatlas been run independently of your lab before? As for the specifics, the object 'dp' is called in the next cell, so the error thrown by "from metatlas.helpers import dill2plots as dp" is critical. The error may stem from the expectation that the code is being run on NERSC. One of the errors (which you can see through the GitHub link) is the following:

    with open(nersc_info['db_passwd_file']) as fid:
        pw = fid.read().strip()
    self.path = 'mysql+pymysql://meta_atlas_admin:%s@nerscdb04.nersc.gov/%s' % (pw, nersc_info['db_name'])
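Judging from that snippet (my reading, not anything from the metatlas docs), the missing file appears to hold only the database password, which gets read and spliced into a SQLAlchemy-style MySQL connection URL pointed at the NERSC host; off NERSC, the hard-coded path simply doesn't exist. A hypothetical local equivalent, with made-up path and host names, would look like:

    # Hypothetical illustration only, not metatlas code.
    LOCAL_PASSWD_FILE = '/home/roli/metatlas_mysql_passwd.txt'  # made-up local path
    LOCAL_DB_NAME = 'meta_atlas'                                # assumed database name

    with open(LOCAL_PASSWD_FILE) as fid:
        pw = fid.read().strip()
    db_url = 'mysql+pymysql://meta_atlas_admin:%s@localhost/%s' % (pw, LOCAL_DB_NAME)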

benbowen commented 6 years ago

Several times, but why are you trying to set it up? You shouldn't need it. To set it up, you would need a path to the raw data (and the raw data itself), a MySQL database, and you would need to populate that database with compounds, atlases, LC-MS run information, etc.

The notebook for analyzing your data shouldn't have any metatlas dependencies.

The fractional enrichment calculation is the amount enriched divided by the total amount.
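In code, that's just the following (a minimal sketch with illustrative variable names, not anything from metatlas):

    def fractional_enrichment(labeled, unlabeled):
        """Fraction of the total signal that is isotopically enriched."""
        total = labeled + unlabeled
        return labeled / total if total else 0.0

    # e.g. labeled (13C) peak area 2.5e5, unlabeled (12C) peak area 7.5e5 -> 0.25
    print(fractional_enrichment(2.5e5, 7.5e5))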

Maybe a screenshare conference call would be a solution to get rolling with this? I'm guessing you are using an old notebook that still had metatlas dependencies.

Roli-Wilhelm commented 6 years ago

That makes it easy! I assumed I had to scrape the data myself since you'd sent me code for it ('Workflow Notebook - Build Isotope Dataset'). If you could send me both the "positive_mode_isotope_data.pkl" and "negative_mode_isotope_data.pkl", then I could avoid messing with metatlas.
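(Assuming those .pkl files are ordinary pickled Python objects, something like pandas DataFrames, I'd plan to load them with nothing more than:

    import pickle

    # File names as quoted above; paths are wherever the files end up locally.
    with open('positive_mode_isotope_data.pkl', 'rb') as fid:
        pos = pickle.load(fid)
    with open('negative_mode_isotope_data.pkl', 'rb') as fid:
        neg = pickle.load(fid)

but correct me if there's more to it.)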

No need for a conference call just yet and thanks for your quick responses.