bulik / ldsc

LD Score Regression (LDSC)

KeyError in munge_sumstats step #157

Open visivas opened 5 years ago

visivas commented 5 years ago

Hi, I am trying to munge the summary statistics from a GWAS output file while also passing a --merge-alleles file. My --merge-alleles file is a very small subset of the GWAS file. When I do this, I get the following KeyError. Any ideas on what could be going on here?

Read 9463160 SNPs from --sumstats file.
Removed 9096987 SNPs not in --merge-alleles.
Removed 0 SNPs with missing values.
Removed 0 SNPs with INFO <= 0.9.
Removed 0 SNPs with MAF <= 0.01.
Removed 0 SNPs with out-of-bounds p-values.
Removed 15286 variants that were not SNPs or were strand-ambiguous.
350887 SNPs remain.
Removed 11 SNPs with duplicated rs numbers (350876 SNPs remain).
Using N = 11190.0
Median value of beta was -2.60831695963e-05, which seems sensible.
Removed 10 SNPs whose alleles did not match --merge-alleles (350866 SNPs remain).

ERROR converting summary statistics:

Traceback (most recent call last):
  File "/XXX/LDSC/ldsc-master/munge_sumstats.py", line 707, in munge_sumstats
    dat = allele_merge(dat, merge_alleles, log)
  File "/XXX/LDSC/ldsc-master/munge_sumstats.py", line 445, in allele_merge
    dat.loc[~jj, [i for i in dat.columns if i != 'SNP']] = float('nan')
  File "/home/anaconda/anaconda2/lib/python2.7/site-packages/pandas/core/indexing.py", line 193, in __setitem__
    indexer = self._get_setitem_indexer(key)
  File "/home/anaconda/anaconda2/lib/python2.7/site-packages/pandas/core/indexing.py", line 171, in _get_setitem_indexer
    return self._convert_tuple(key, is_setter=True)
  File "/home/anaconda/anaconda2/lib/python2.7/site-packages/pandas/core/indexing.py", line 242, in _convert_tuple
    idx = self._convert_to_indexer(k, axis=i, is_setter=is_setter)
  File "/home/anaconda/anaconda2/lib/python2.7/site-packages/pandas/core/indexing.py", line 1269, in _convert_to_indexer
    .format(mask=objarr[mask]))
KeyError: '[-1 -1 -1 ... -2 -2 -2] not in index'
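The -1/-2 values in the KeyError hint that the mask `~jj` has lost its boolean dtype: applying `~` to integer flags does bitwise negation (`~0 == -1`, `~1 == -2`), and newer pandas then treats the result in `.loc` as row labels rather than a boolean mask. A minimal sketch of that failure mode (my own toy example, not code from munge_sumstats.py):

```python
import pandas as pd

# Toy stand-in for the munged sumstats table.
dat = pd.DataFrame({'SNP': ['rs1', 'rs2', 'rs3'],
                    'A1': ['A', 'C', 'G'],
                    'Z': [0.3, -1.1, 2.4]})

jj_bool = pd.Series([True, False, True])   # allele-match flags as proper booleans
jj_int = pd.Series([1, 0, 1])              # same flags, but with an integer dtype

print(~jj_bool.values)   # [False  True False] -> still a valid mask
print(~jj_int.values)    # [-2 -1 -2]          -> resembles the values in the error

# Works on any pandas version: boolean mask selects rows, other columns set to NaN.
dat.loc[~jj_bool, [c for c in dat.columns if c != 'SNP']] = float('nan')

# On recent pandas the integer version raises roughly
# "KeyError: '[-2 -1 -2] not in index'", matching the traceback above:
# dat.loc[~jj_int, [c for c in dat.columns if c != 'SNP']] = float('nan')
```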

mxcai commented 5 years ago

I have got the same issue. Did you figure out the solution?

melothemightyone commented 5 years ago

> I have got the same issue. Did you figure out the solution?

Try an older version of pandas (maybe < 0.21.0).
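If downgrading pandas is not an option, the error message also suggests a local workaround: coerce the mask back to a boolean dtype before the `.loc` assignment at munge_sumstats.py line 445. The sketch below only shows that idea on a toy DataFrame, assuming the non-boolean mask is really the cause; it is not an official LDSC patch and I have not tested it inside munge_sumstats.py.

```python
import pandas as pd

dat = pd.DataFrame({'SNP': ['rs1', 'rs2', 'rs3'],
                    'A1': ['A', 'C', 'G'],
                    'Z': [0.3, -1.1, 2.4]})
jj = pd.Series([1, 0, 1])     # flags that have lost their bool dtype

jj = jj.astype(bool)          # restore the boolean dtype before negating
dat.loc[~jj, [i for i in dat.columns if i != 'SNP']] = float('nan')
print(dat)                    # only the rs2 row is set to NaN
```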