caporaso-lab / mockrobiota

A public resource for microbiome bioinformatics benchmarking using artificially constructed (i.e., mock) communities.
http://mockrobiota.caporasolab.us
BSD 3-Clause "New" or "Revised" License

correct the precision in expected taxonomy files #3

Closed: gregcaporaso closed this issue 8 years ago

gregcaporaso commented 8 years ago

@nbokulich, when @jairideout and I were testing this we discovered that the relative abundances are off in some of the expected taxonomy files. We wanted to confirm that the taxa abundances in each sample sum to 1.0 (to seven decimal places, the unittest.assertFloatEqual default). We found that in some cases they're not equal even to two decimal places. For example, in mock-3, the sum of the values in sample HMPMockV1.2.Staggered1 is 1.02. Would you be able to look into this? We have a test file that you can run now which will help you identify these samples (run python tests/check_data_integrity.py in this repository - this is Python 3 only).

Note that this issue is causing the current build to fail - I think it's important that this blocks people from using the data for now.
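The check described above boils down to summing each sample's column and comparing the total to 1.0 within seven decimal places. A minimal sketch of that idea, assuming a tab-separated layout with a taxonomy column followed by one column per sample (the actual tests/check_data_integrity.py in the repository may be structured differently):

```python
import csv
import io
import math

def check_sums(tsv_text, places=7):
    """Return {sample: (total, ok)} for each sample column in a taxonomy table.

    `ok` is True when the column's abundances sum to 1.0 within the given
    number of decimal places (seven, matching the default mentioned above).
    """
    reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    header = next(reader)
    samples = header[1:]  # assumption: first column is the taxonomy string
    totals = dict.fromkeys(samples, 0.0)
    for row in reader:
        for sample, value in zip(samples, row[1:]):
            totals[sample] += float(value)
    return {s: (t, math.isclose(t, 1.0, abs_tol=10 ** -places))
            for s, t in totals.items()}

# Hypothetical two-taxon table whose abundances sum to 1.02, mirroring the
# mock-3 HMPMockV1.2.Staggered1 discrepancy reported in this issue.
demo = ("Taxonomy\tHMPMockV1.2.Staggered1\n"
        "k__Bacteria;p__A\t0.60\n"
        "k__Bacteria;p__B\t0.42\n")
print(check_sums(demo))
```

Running this flags the sample because its total of 1.02 is not within 1e-7 of 1.0.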

nbokulich commented 8 years ago

Sounds like this is a rounding issue... I will look into this. Thanks for noticing!

Is this an issue with just Greengenes, just SILVA, or both?


gregcaporaso commented 8 years ago

It's an issue with a bunch of them, to varying degrees. I'm realizing now that a table that sums the values for all dataset/database/version combinations would help. Either @jairideout or I can generate this for you today.
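The summary table mentioned here would just list the per-sample totals for every dataset/database/version combination. A hedged sketch of what generating it might look like; the nested-dict input shape and the values shown are hypothetical, not taken from the repository:

```python
def summary_table(sums):
    """Render {(dataset, database, version): {sample: total}} as a TSV table."""
    lines = ["dataset\tdatabase\tversion\tsample\tsum"]
    for (dataset, database, version), samples in sorted(sums.items()):
        for sample, total in sorted(samples.items()):
            # Seven decimal places, matching the precision discussed above.
            lines.append(f"{dataset}\t{database}\t{version}\t{sample}\t{total:.7f}")
    return "\n".join(lines)

# Hypothetical entry echoing the mock-3 discrepancy from this thread.
demo = {("mock-3", "greengenes", "13_8"): {"HMPMockV1.2.Staggered1": 1.02}}
print(summary_table(demo))
```

Scanning such a table makes it easy to see which dataset/database/version combinations deviate from 1.0 and by how much.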

jairideout commented 8 years ago

Fixed in #9.