Closed: moustakas closed this issue 3 years ago.
I am sorry for the delay. Tagging myself on this issue.
I've confirmed that this is an issue and I think there's a straightforward fix.
The problem is that the following catalogs
/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
only have a small fraction of the columns normally found in a targets catalog. Consequently, when fiberassign is/was run, the columns which are usually populated, like FLUX_[G,R,Z], PHOTSYS, etc., are empty, e.g.,
from astropy.table import Table
# `info` is a one-row table (constructed earlier) holding the TARGETID and coordinates of an example affected target
info
<Table length=1>
TARGETID TARGET_RA TARGET_DEC
int64 float64 float64
----------------- ---------------- ----------------
39633489506601735 149.490652378136 70.0725222414694
fm = Table.read('/global/cfs/cdirs/desi/target/fiberassign/tiles/trunk/080/fiberassign-080696.fits.gz', 'FIBERASSIGN')
targ = Table.read('/global/cfs/cdirs/desi/target/fiberassign/tiles/trunk/080/fiberassign-080696.fits.gz', 'TARGETS')
fm = fm[fm['TARGETID'] == info['TARGETID']]
targ = targ[targ['TARGETID'] == info['TARGETID']]
fm['PHOTSYS', 'FLUX_G', 'FLUX_R', 'FLUX_Z']
<Table length=1>
PHOTSYS FLUX_G FLUX_R FLUX_Z
bytes1 float32 float32 float32
------- ------- ------- -------
0.0 0.0 0.0
This particular target, however, has perfectly valid photometry from DR9:
import fitsio
import astropy.units as u
from astropy.coordinates import SkyCoord
tt = Table(fitsio.read('/global/cfs/cdirs/cosmo/data/legacysurvey/dr9/north/sweep/9.0/sweep-140p070-150p075.fits'))
c1 = SkyCoord(ra=tt['RA']*u.deg, dec=tt['DEC']*u.deg)
c2 = SkyCoord(ra=info['TARGET_RA']*u.deg, dec=info['TARGET_DEC']*u.deg)
indx, _, _ = c2.match_to_catalog_sky(c1)
tt[indx]['FLUX_G', 'FLUX_R', 'FLUX_Z', 'RELEASE']
<Table length=1>
FLUX_G FLUX_R FLUX_Z RELEASE
float32 float32 float32 int16
--------- --------- -------- -------
1.6022577 4.1805983 9.720711 9011
It's this photometry, of course, which I need as input to fastspecfit to do my SED modeling.
My proposed solution is to create two new catalogs, row-matched to these two files,
/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
which contain the original DR9 photometry (plus a couple of other columns normally generated by the target-selection code, like PHOTSYS), and then to read those catalogs for the handful of tiles which contain these secondary targets.
@geordie666 is this a reasonable solution?
If so, @stephjuneau or @Ragadeepika-Pucha would you be willing to generate these catalogs?
@moustakas: What you propose is certainly a reasonable solution for the small subset of cases where secondary targets (i) exist in DR9 and (ii) were observed as part of sv1. For wider, general context: the secondary targets do retain their TARGETID (which encodes RELEASE) from DR9 in the "standalone" secondary files (e.g. /global/cfs/cdirs/desi/target/catalogs/dr9/0.57.0/targets/sv3/secondary/dark/sv3targets-dark-secondary.fits).
Thanks @geordie666, I'm not planning to support targets without DR9 photometry.
For the record, here's my current procedure (please comment!); a rough sketch of steps [1]-[3] follows the list:
[1] For each coadd-*.fits file and corresponding redrock-*.fits file, read the REDSHIFTS, FIBERMAP and EXP_FIBERMAP extensions.
[2] Apply the selection z>0 & zwarn <= 4 & objtype='TGT' & coadd_fiberstatus==0 & (photsys == 'N' | photsys == 'S').
[3] For each TILEID in the EXP_FIBERMAP table, read the corresponding fiberassign header. Use these headers to infer the target catalog directory(ies) used during fiber-assignment via the TARG, TARG2, TARG3, TARG4, and SCND (if present) header cards.
[4] From those target catalogs, read the columns ['TARGETID', 'RA', 'DEC', 'FIBERFLUX_G', 'FIBERFLUX_R', 'FIBERFLUX_Z', 'FIBERTOTFLUX_G', 'FIBERTOTFLUX_R', 'FIBERTOTFLUX_Z', 'FLUX_G', 'FLUX_R', 'FLUX_Z', 'FLUX_W1', 'FLUX_W2', 'FLUX_IVAR_G', 'FLUX_IVAR_R', 'FLUX_IVAR_Z', 'FLUX_IVAR_W1', 'FLUX_IVAR_W2'].
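A minimal sketch of steps [1]-[3], assuming the standard Everest-style coadd/redrock extensions and column names, and that the TARG*/SCND cards live in the fiberassign primary header (the example file and tile paths come from elsewhere in this thread; adjust as needed):

import fitsio
import numpy as np
from astropy.table import Table

# [1] Read the redrock and coadd extensions (the REDSHIFTS and FIBERMAP rows are row-matched).
redrockfile = 'redrock-sv1-dark-9157.fits'
coaddfile = 'coadd-sv1-dark-9157.fits'
zcat = Table.read(redrockfile, 'REDSHIFTS')
fmap = Table.read(coaddfile, 'FIBERMAP')
expfmap = Table.read(coaddfile, 'EXP_FIBERMAP')

# [2] Quality cuts; the PHOTSYS cut is the one under discussion below.
keep = ((zcat['Z'] > 0) & (zcat['ZWARN'] <= 4) & (fmap['OBJTYPE'] == 'TGT') &
        (fmap['COADD_FIBERSTATUS'] == 0) &
        ((fmap['PHOTSYS'] == 'N') | (fmap['PHOTSYS'] == 'S')))
zcat, fmap = zcat[keep], fmap[keep]

# [3] Infer the target catalog directories from the fiberassign header of each tile.
targdirs = []
for tileid in np.unique(expfmap['TILEID']):
    fafile = '/global/cfs/cdirs/desi/target/fiberassign/tiles/trunk/{}/fiberassign-{:06d}.fits.gz'.format(
        '{:06d}'.format(tileid)[:3], tileid)
    hdr = fitsio.read_header(fafile)  # assumes the TARG*/SCND cards are in the primary header
    targdirs += [hdr[key] for key in ('TARG', 'TARG2', 'TARG3', 'TARG4', 'SCND') if key in hdr]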
The trouble with this sequence is that the SV1 targets with PHOTSYS blank (including the low-mass AGN that @Ragadeepika-Pucha and @stephjuneau are interested in) get dropped in step [2] because of the PHOTSYS cut.
So what I'd really like to do is to remove the PHOTSYS cut from step [2], read PHOTSYS in step [4] (after we've generated new row-matched photometric catalogs for the SV1 targets), and add an additional selection cut in or after step [4] to remove objects that don't have DR9 photometry (unless there's a way for me to glean that information in step [2]???).
I think that sequence is correct @moustakas. The only way I can think of to (potentially) simplify step [2] is to make an explicit check on whether a target is from sv1. If the target has photsys blank but is from sv1, you would then retain the object for special treatment. It's really only sv1 that's the problem here, I think.
There might be header information corresponding to sv1 in some of the files you're reading. But, certainly, if the target columns look like, e.g., SV1_DESI_TARGET in the coadd file, then it's an sv1 file. To check whether targets are from sv1, you can use desitarget.targets.main_cmx_or_sv, e.g.:
import fitsio
from desitarget.targets import main_cmx_or_sv
objs = fitsio.read("/global/cfs/cdirs/desi/spectro/redux/daily/tiles/cumulative/80865/20210510/coadd-3-80865-thru20210510.fits", "FIBERMAP")
cols, Mx, survey = main_cmx_or_sv(objs)
survey
Out[]: 'sv1'
@moustakas - I can create the catalogs that you need for the SV1 secondary targets. Can you confirm what columns you need from DR9 photometry? Are these the ones from step [4]? I can add blank PHOTSYS and '-99' for the rest of the columns for the targets without available DR9 photometry. Or would you suggest something else?
Terrific, @Ragadeepika-Pucha, thanks. Please propagate every column that's in the DR9 sweep files (plus the TARGETID that's in the targets catalog now, and PHOTSYS); that way we won't have to do this again.
To determine which SV1 secondary targets should have DR9 photometry (rather than having to match on RA,Dec), I just checked that the trick @geordie666 gave us above works beautifully:
import numpy as np
from astropy.table import Table
from desitarget.targets import decode_targetid
tt = Table.read('/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits')
objid, brickid, releases, _, _, _ = decode_targetid(tt['TARGETID'])
print(np.count_nonzero(releases > 9000), len(tt))
18882222 22354100
(And we can use objid and brickid to find the right row of the sweep catalog.)
Oh and let's use zero for empty column values (not -99). That's my preference at least. Thanks!
@Ragadeepika-Pucha has prepared row-matched catalogs---thank you!
@geordie666 could you help review?
ioannis@cori09:~% ls -l /global/u1/r/raga19/data/
total 23530324
-rw-rw-r--+ 1 raga19 desi 2007959040 Aug 11 21:59 sv1targets-bright-secondary-dr9photometry.fits
-rw-rw-r--+ 1 raga19 desi 21236420160 Aug 13 10:51 sv1targets-dark-secondary-dr9photometry.fits
cd /global/u1/r/raga19/data/
fitsinfo sv1targets-dark-secondary-dr9photometry.fits
Filename: sv1targets-dark-secondary-dr9photometry.fits
No. Name Ver Type Cards Dimensions Format
0 PRIMARY 1 PrimaryHDU 4 ()
1 1 BinTableHDU 250 22354100R x 121C [K, 1A, K, K, 8A, K, 3A, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, K, K, K, K, K, K, K, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, D, K, K, K, K, K, K, K, K, D, D, D, D, D, D, D, D, D, D, D, 8A, D, D, D, D, D, D, D, D, D, D, D, D, 2A, K, D, D, D, D, D, D, D, D, K, D, D, K, D, D, D, D, D, D, K, K, D, D]
fitsinfo /global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
Filename: /global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
No. Name Ver Type Cards Dimensions Format
0 PRIMARY 1 PrimaryHDU 6 ()
1 SCND_TARGETS 1 BinTableHDU 71 22354100R x 16C [D, D, E, E, E, L, E, E, E, K, K, K, D, K, K, K]
I can help review but I'm in meetings the rest of the day, so won't get to this until tomorrow morning.
I ran through 1000 secondary targets from each of the "bright" and "dark" files, and I confirm that everything seems correct in Raga's files. Here's the snippet of code I used:
import os
from time import time
import numpy as np
import fitsio
from desitarget.targets import encode_targetid

rows_to_test = 1000
for obscon in "bright", "dark":
    secs = fitsio.read("/global/u1/r/raga19/data/sv1targets-{}-secondary-dr9photometry.fits".format(obscon), upper=True)
    secsnophot = fitsio.read("/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/{}/sv1targets-{}-secondary.fits".format(obscon, obscon), upper=True)
    secsnophot = secsnophot[secs["RA"] < 1e20]
    secs = secs[secs["RA"] < 1e20]
    print("Working with {} {} secondaries".format(len(secs), obscon))
    print("TARGETIDs match: {}".format(np.all(secs["TARGETID"] == secsnophot["TARGETID"])))
    alltrac = []
    start = time()
    for i, objs in enumerate(secs[:rows_to_test]):
        ns = ["south", "north"][objs["PHOTSYS"] == 'N']
        brickname = objs["BRICKNAME"]
        fn = os.path.join("/global/cfs/cdirs/cosmo/data/legacysurvey/dr9/{}/tractor/".format(ns), "{}/tractor-{}.fits".format(brickname[:3], brickname))
        trac = fitsio.read(fn, upper=True)
        alltrac.append(trac[trac["OBJID"] == objs["OBJID"]])
        if i % 100 == 99:
            print("Checked {}/{} secondaries...t={:.1f}s".format(i+1, len(secs), time()-start))
    alltrac = np.concatenate(alltrac)
    tids = encode_targetid(objid=alltrac["OBJID"], brickid=alltrac["BRICKID"], release=alltrac["RELEASE"])
    print("Tractor TARGETIDs match: {} for {} secondaries".format(np.all(tids == secs[:rows_to_test]["TARGETID"]), rows_to_test))
Depending on how formal @moustakas wants to make these files, we might want more informative header information. We might also want a different schema than just setting all of the columns to "ridiculous" values when there is no match with the Legacy Surveys Tractor catalogs.
If these files are only ever used for spectrophotometric calibration of a few SV1 secondary targets, I'm not sure we want to be too pedantic, though.
Thanks for the checks, @geordie666.
@Ragadeepika-Pucha, is it straightforward for you to make a couple of changes to these catalogs? The biggest change is that the data type of each column in the catalogs you made should match the corresponding column in the sweep catalogs. (For example, in your catalogs sersic is float64 but in the sweeps it's float32. Also, in the sweeps the column names are capitalized, but that's a minor difference.)
I also agree with @geordie666 that for non-matches we should have simpler "blank" values. For non-matches it looks like you're using N/A for strings (although photsys for non-matches is N, which I think we should definitely change) and either 999999 or 1e+20 for floats. For simplicity, I suggest using zero (or minus one, although negative values can sometimes cause problems).
Are these changes easy for you?
In case it helps, the full data model for every column in the sweeps files is here:
@moustakas -- I think it is pretty straightforward for me to 1) change the datatype of columns, and 2) change the column names to capital letters. However, I am not completely sure about how to insert zero values for all the columns (given different datatypes) for non-matches. I will give it a try and let you know. @geordie666 - Thank you for the datamodel.
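In case it's useful, here's a minimal sketch of one way to do the dtype matching and blank filling (sweep is a DR9 sweep array providing the reference dtype, cat is the current row-matched catalog with lower-case column names, and matched is a boolean array flagging rows with DR9 photometry; all names here are hypothetical):

import numpy as np

# An array built on the sweep dtype is initialized to 0 / 0.0 / empty string / False
# in every column, so non-matched rows can simply be left at their default values.
out = np.zeros(len(cat), dtype=sweep.dtype)
for name in sweep.dtype.names:
    out[name][matched] = np.asarray(cat[name.lower()][matched], dtype=sweep.dtype[name])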
@moustakas - I have changed the datatype of columns and changed the names to capital letters. In the case of non-matches -- I filled it with 0 and 0.0 for int and float columns respectively -- and empty strings in 'S' dtype columns. There is a bool column - which is marked as '?' in dtype -- I filled the non-matches as False in this column. Is this fine? If yes, I will double-check everything and publish the new tables.
@Ragadeepika-Pucha this all sounds great, thank you.
The catalogs are sv1targets-bright-secondary-dr9photometry.fits and sv1targets-dark-secondary-dr9photometry.fits; they are located in /global/u1/r/raga19/data/. To summarize -
Hopefully this works fine. Let me know if there is anything more. @moustakas Thank you.
Thank you, @Ragadeepika-Pucha! I spot-checked the new catalogs and everything looks good.
@geordie666 could you copy these catalogs into place so I can more fully test them within the fastspecfit pipeline? (I don't have permission to do so.)
cp /global/u1/r/raga19/data/sv1targets-dark-secondary-dr9photometry.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/
cp /global/u1/r/raga19/data/sv1targets-bright-secondary-dr9photometry.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/bright/
OK, these files are now in the appropriate directories.
Thanks, @geordie666.
I finally got around to testing these catalogs and realized that we missed a few other locations.
@Ragadeepika-Pucha, we need row-matched DR9 photometry for the following SV1 catalogs, too, in order for me to accommodate all the SV1 tiles. Is it straightforward for you to make them?
-r--r----- 1 62359 desi 257394240 Jan 9 2021 /global/cfs/cdirs/desi/target/catalogs/dr9/0.48.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
-r--r----- 1 62359 desi 214087680 Jan 19 2021 /global/cfs/cdirs/desi/target/catalogs/dr9/0.49.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
-r--r----- 1 62359 desi 252198720 Mar 7 15:07 /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
-r--r----- 1 62359 desi 271535040 Mar 14 05:14 /global/cfs/cdirs/desi/target/catalogs/dr9/0.52.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
and
-r--r----- 1 62359 desi 4390352640 Jan 9 2021 /global/cfs/cdirs/desi/target/catalogs/dr9/0.48.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
-r--r----- 1 62359 desi 2177593920 Jan 19 2021 /global/cfs/cdirs/desi/target/catalogs/dr9/0.49.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
-r--r----- 1 62359 desi 2654876160 Mar 7 15:07 /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
-r--r----- 1 62359 desi 2686466880 Mar 14 05:14 /global/cfs/cdirs/desi/target/catalogs/dr9/0.52.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
@moustakas: Are you sure that's the correct list? Anand showed me the following neat trick for determining which target versions were actually observed on tiles:
import numpy as np
from astropy.table import Table

d = Table.read("/global/cfs/cdirs/desi/users/raichoor/fiberassign-rerun/fiberassign-history.ecsv")
d = d[d["SURVEY"] == "sv1"]
for cat in np.unique(d["DESITARGET_CAT"]):
    sel = d["DESITARGET_CAT"] == cat
    print("{} : {} tiles".format(cat, sel.sum()))
...
DESIROOT/target/catalogs/dr9/0.47.0/targets/cmx/resolve/no-obscon/ : 1 tiles
DESIROOT/target/catalogs/dr9/0.47.0/targets/sv1/resolve/bright/ : 47 tiles
DESIROOT/target/catalogs/dr9/0.47.0/targets/sv1/resolve/dark/ : 34 tiles
DESIROOT/target/catalogs/dr9/0.50.0/targets/sv1/resolve/bright : 18 tiles
DESIROOT/target/catalogs/dr9/0.50.0/targets/sv1/resolve/dark : 17 tiles
DESIROOT/target/catalogs/dr9/0.51.0/targets/sv1/resolve/dark : 29 tiles
DESIROOT/target/catalogs/gaiadr2/0.48.0/targets/sv1/resolve/supp : 2 tiles
So, it seems like we did observe version 0.47.0 for sv1 but we did not observe 0.52.0?
What I did was much dumber, @geordie666. I didn't consider what was observed, I just did:
ls -l /global/cfs/cdirs/desi/target/catalogs/dr9/*/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
ls -l /global/cfs/cdirs/desi/target/catalogs/dr9/*/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
Maybe someone will want DR9 photometry for these intermediately-tagged catalogs even if we never got on-sky with them? I dunno.
In any case, @Ragadeepika-Pucha, in order for me to push your SV1 sample through fastspecfit I need DR9 photometry for tags 0.47.0 bright/dark and 0.51.0 dark (you already did 0.50.0 bright/dark, and 0.48.0 is Gaia, which I don't think we care about).
Hey, Anand's the smart one. I think you're correct that we should just run all of them if @Ragadeepika-Pucha is willing? It's probably best to process the files while this is fresh-in-the-memory, even if nobody ultimately needs the extra catalogs.
@moustakas @geordie666 - I will work on them this week and keep you posted. Thanks!
@moustakas - Can you please confirm which files you are requiring photometry for?
0.47.0 does not have any secondary/* files. The sv1targets-*-secondary.fits files that I found are the following -
/global/cfs/cdirs/desi/target/catalogs/dr9/0.48.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.49.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.52.0/targets/sv1/secondary/bright/sv1targets-bright-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.48.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.49.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
/global/cfs/cdirs/desi/target/catalogs/dr9/0.52.0/targets/sv1/secondary/dark/sv1targets-dark-secondary.fits
The resolve folder probably doesn't contain secondary targets? 0.47.0 only has a resolve folder.
I am working on the 0.51.0 version now - can you let me know whether the ten files listed above are the correct ones?
That list looks right @Ragadeepika-Pucha! (I didn't mention 0.50.0 because you already made those catalogs.)
@moustakas - all the files are ready and are located here - /global/cfs/cdirs/desi/users/raga19/data/. All the version 0.*.0/ folders have one file corresponding to 'bright' and one file corresponding to 'dark', except 0.48.0.
The dark catalog for 0.48.0 had ~45.26 million rows and I couldn't save the whole table into a single FITS file because of memory issues. The exclusive CPU node was not starting properly for me, so I saved it into two tables - sv1targets_dark_secondary_p1.fits and sv1targets_dark_secondary_p2.fits. The first one has the first 22 million rows and the second one has the rest. I will try to combine them again tomorrow, but my Jupyter notebook keeps crashing.
Let me know if there are any other issues or anything else. Thank you.
Thank you, @Ragadeepika-Pucha!
I was able to combine the troublesome catalogs on a login node with
import os
import fitsio
from astropy.table import Table, vstack

os.chdir('/global/cfs/cdirs/desi/users/raga19/data/0.48.0')
t1 = Table(fitsio.read('sv1targets_dark_secondary_dr9_p1.fits'))
t2 = Table(fitsio.read('sv1targets_dark_secondary_dr9_p2.fits'))
out = vstack((t1, t2))
out.write('sv1targets_dark_secondary_dr9.fits')
And then I fixed permissions. I also spot-checked the merged catalog and everything looks OK as far as I can tell.
@geordie666 could we please copy these catalogs into place? Please feel free to double-check me on these commands! Also, I propose we overwrite the existing 0.50.0 DR9 photometric catalogs in case there were any changes to @Ragadeepika-Pucha's code.
cp /global/cfs/cdirs/desi/users/raga19/data/0.48.0/sv1targets_dark_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.48.0/targets/sv1/secondary/dark/sv1targets-dark-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.48.0/sv1targets_bright_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.48.0/targets/sv1/secondary/bright/sv1targets-bright-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.49.0/sv1targets_dark_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.49.0/targets/sv1/secondary/dark/sv1targets-dark-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.49.0/sv1targets_bright_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.49.0/targets/sv1/secondary/bright/sv1targets-bright-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.50.0/sv1targets_dark_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/dark/sv1targets-dark-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.50.0/sv1targets_bright_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.50.0/targets/sv1/secondary/bright/sv1targets-bright-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.51.0/sv1targets_dark_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/dark/sv1targets-dark-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.51.0/sv1targets_bright_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/bright/sv1targets-bright-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.52.0/sv1targets_dark_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.52.0/targets/sv1/secondary/dark/sv1targets-dark-secondary-dr9photometry.fits
cp /global/cfs/cdirs/desi/users/raga19/data/0.52.0/sv1targets_bright_secondary_dr9.fits /global/cfs/cdirs/desi/target/catalogs/dr9/0.52.0/targets/sv1/secondary/bright/sv1targets-bright-secondary-dr9photometry.fits
The catalogs have now been copied into place.
Thanks @geordie666.
Testing a handful of @Ragadeepika-Pucha's galaxies with
fastspec /global/cfs/cdirs/desi/spectro/redux/everest/healpix/sv1/dark/91/9157/redrock-sv1-dark-9157.fits \
-o fastspec.fits --specprod everest --mp 7 \
--targetids 39632936277902799,39632931181824699,39632936277905113,39632936273709365,39632936282096442,39632936273708359,39632931177631304
I get an error:
INFO:io.py:166:find_specfiles: Parsed coadd_type=healpix
INFO:io.py:63:_get_targetdirs: Reading /global/cfs/cdirs/desi/target/fiberassign/tiles/trunk/080/fiberassign-080896.fits.gz header.
INFO:io.py:341:find_specfiles: Matched 3 targets in /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/resolve/dark/sv1targets-dark-hp-143.fits
INFO:io.py:341:find_specfiles: Matched 4 targets in /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/dark/sv1targets-dark-secondary-dr9photometry.fits
INFO:io.py:63:_get_targetdirs: Reading /global/cfs/cdirs/desi/target/fiberassign/tiles/trunk/080/fiberassign-080895.fits.gz header.
INFO:io.py:341:find_specfiles: Matched 3 targets in /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/resolve/dark/sv1targets-dark-hp-143.fits
INFO:io.py:341:find_specfiles: Matched 4 targets in /global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/secondary/dark/sv1targets-dark-secondary-dr9photometry.fits
Traceback (most recent call last):
File "/global/homes/i/ioannis/repos/desihub/fastspecfit/bin/fastspec", line 9, in <module>
fastspec(comm=None)
File "/global/homes/i/ioannis/repos/desihub/fastspecfit/py/fastspecfit/fastspecfit.py", line 204, in fastspec
Spec.find_specfiles(args.redrockfiles, firsttarget=args.firsttarget,
File "/global/homes/i/ioannis/repos/desihub/fastspecfit/py/fastspecfit/io.py", line 346, in find_specfiles
targets = Table(np.hstack(targets))
File "<__array_function__ internals>", line 5, in hstack
File "/opt/conda/lib/python3.8/site-packages/numpy/core/shape_base.py", line 344, in hstack
return _nx.concatenate(arrs, 0)
File "<__array_function__ internals>", line 5, in concatenate
ValueError: could not convert string to float: 'N'
If I try to use astropy.table.vstack I get
*** NotImplementedError: vstack unavailable for mixin column type(s): NdarrayMixin
so something is amiss.
Unfortunately I'm out of time right now to dig in case anyone else is inspired...
@moustakas - The error comes from the PHOTSYS column in the catalogs I created. The datatype of TARGETID matches the targets catalog and the datatypes of the LS DR9 columns match the sweep files. However, the PHOTSYS column is 'S1' in my catalogs, while it is '<U1' in all the everest files. I am sorry for the confusion - I am making the changes to all the catalogs now.
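For reference, converting such a column in memory is a one-liner; a sketch assuming an astropy Table named cat:

# Cast the bytes ('S1') PHOTSYS column to fixed-width unicode ('<U1').
cat['PHOTSYS'] = cat['PHOTSYS'].astype('U1')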
@geordie666 - Is it possible to make the changes directly to the catalogs wherever you have staged them? Or should I let you know about the changed catalogs?
@Ragadeepika-Pucha: Let me know the location of the changed files once they're ready and then I'll transfer them to the write-protected locations.
@moustakas - I finished making the changes to the PHOTSYS column for all the catalogs. The 0.48.0 catalogs - *p1.fits and *p2.fits - need to be combined again. Can you test the code again using these versions before transferring them to their locations? Thanks!
Thanks @Ragadeepika-Pucha, I'll test the catalogs, but if you could merge your two partial catalogs that'd be helpful. I gave the snippet of code above; you just need to log in and do
source /global/cfs/cdirs/desi/software/desi_environment.sh master
@moustakas - I cannot merge the two partial catalogs because of memory issues. I am unable to use my exclusive CPU node access, and the normal Cori access keeps crashing because of the huge number of rows.
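In case it helps, here's a rough sketch of a lower-memory way to do the merge with fitsio, appending the second table onto a copy of the first in row chunks rather than holding both tables in memory at once (file names as in the merge snippet above; the chunk size is arbitrary):

import shutil
import fitsio

# Start from a writable copy of part 1, then append part 2 chunk by chunk.
shutil.copy('sv1targets_dark_secondary_dr9_p1.fits', 'sv1targets_dark_secondary_dr9.fits')
chunk = 1_000_000
with fitsio.FITS('sv1targets_dark_secondary_dr9_p2.fits') as infits, \
     fitsio.FITS('sv1targets_dark_secondary_dr9.fits', 'rw') as outfits:
    nrows = infits[1].get_nrows()
    for start in range(0, nrows, chunk):
        stop = min(start + chunk, nrows)
        outfits[1].append(infits[1][start:stop])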
I'm getting the same error. I don't quite understand it, but it looks like the order of the columns in your catalog is the issue, so when I try to stack the catalogs, np.hstack complains.
Here's a snippet of output just so I don't forget (no action items, I'm working on it).
(Pdb) Table(fitsio.read('/global/cfs/cdirs/desi/target/catalogs/dr9/0.51.0/targets/sv1/resolve/dark/sv1targets-dark-hp-143.fits', rows=0, columns=targetcols))
<Table length=1>
RA DEC FLUX_G FLUX_R FLUX_Z FLUX_IVAR_G ... FIBERFLUX_Z FIBERTOTFLUX_G FIBERTOTFLUX_R FIBERTOTFLUX_Z PHOTSYS TARGETID
float64 float64 float32 float32 float32 float32 ... float32 float32 float32 float32 str1 int64
---------------- ----------------- --------- --------- --------- ----------- ... ----------- -------------- -------------- -------------- ------- -----------------
227.906196819581 32.89192548503159 5.0230317 18.375298 54.065174 372.7904 ... 41.97429 3.901896 14.273912 42.05138 N 39632941373980741
(Pdb) Table(targets[0])
<Table length=3>
RA DEC FLUX_G FLUX_R FLUX_Z FLUX_IVAR_G ... FIBERFLUX_Z FIBERTOTFLUX_G FIBERTOTFLUX_R FIBERTOTFLUX_Z PHOTSYS TARGETID
float64 float64 float32 float32 float32 float32 ... float32 float32 float32 float32 str1 int64
------------------ ----------------- --------- --------- ---------- ----------- ... ----------- -------------- -------------- -------------- ------- -----------------
226.96923847156185 32.44379674657387 1.6316719 3.2548742 2.3007507 320.8553 ... 1.7891678 1.2688626 2.5311382 1.7891685 N 39632931177631304
227.2240108136138 32.68367592661154 4.2744813 8.934247 14.078902 256.47095 ... 5.350051 1.625701 3.3975642 5.3531094 N 39632936282096442
227.14437016623828 32.75792289396391 4.084077 9.0138445 15.2085285 427.11798 ... 8.869176 2.3817163 5.2566147 8.869176 N 39632936277905113
(Pdb) Table(targets[1])
<Table length=4>
TARGETID PHOTSYS RA DEC FLUX_G FLUX_R ... FIBERFLUX_G FIBERFLUX_R FIBERFLUX_Z FIBERTOTFLUX_G FIBERTOTFLUX_R FIBERTOTFLUX_Z
int64 str1 float64 float64 float32 float32 ... float32 float32 float32 float32 float32 float32
----------------- ------- ------------------ ------------------ -------- -------- ... ----------- ----------- ----------- -------------- -------------- --------------
39632931181824699 N 227.2130001384101 32.414673886647066 1.594635 4.088936 ... 0.908661 2.329973 4.562993 0.908661 2.329973 4.562993
39632936277902799 N 226.97106504521375 32.825430731213345 2.542069 6.002813 ... 1.126328 2.659698 4.918194 1.126328 2.659698 4.918194
39632936273708359 N 226.66432745704736 32.67825968016942 2.408377 4.597582 ... 1.344683 2.566995 4.09719 1.344683 2.566995 4.09719
39632936273709365 N 226.7409700725095 32.81154396638427 2.066544 6.093318 ... 1.300753 3.83534 7.052577 1.300753 3.83534 7.052577
If it is the order of the columns, then you can maybe try this: if Table(targets[0]) is table1 and Table(targets[1]) is table2, then -
from astropy.table import vstack
columns = table1.colnames
table2_new = table2[columns]
stacked_table = vstack((table1, table2_new))
I am not sure why you want to do hstack - if the purpose is to make one single table with all the sources, then shouldn't we be using vstack?
OK, I've handled the outstanding issue in code in fastspecfit, but if we think anyone else is going to be stacking these SV1 catalogs with non-SV1 catalogs then we should probably fix this.
@geordie666 can you please move the files into their final resting place? Note that I combined the 0.48.0 dark catalogs for @Ragadeepika-Pucha as requested, so we should be good.
@Ragadeepika-Pucha, using np.hstack with a list of np.ndarray tables is much faster than using astropy.table.vstack on a list of astropy.table.Table tables. I can't remember who I learned this trick from---I think @tskisner---but whoever it was, thank you!
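For reference, here's a sketch of how a column-order mismatch can be handled while staying with numpy structured arrays (a and b are hypothetical arrays with the same columns in different orders; this is not the actual fastspecfit fix):

import numpy as np
from numpy.lib import recfunctions as rfn

# Reorder b's fields to match a and repack both arrays so their dtypes compare equal,
# then concatenate; this avoids numpy trying to promote mismatched columns.
a_packed = rfn.repack_fields(a)
b_packed = rfn.repack_fields(b[list(a.dtype.names)])
stacked = np.hstack((a_packed, b_packed))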
OK, files should now be in place with the correct permissions at:
ls -alrt /global/cfs/cdirs/desi/target/catalogs/dr9/*/targets/sv1/secondary/*/sv1targets-*-secondary-dr9photometry.fits
Thanks @geordie666. If there's justice in the world, this will close this ticket. (But I'll reopen if there are any issues in production...)
Unfortunately it looks like the main-survey secondary target catalogs are also missing PHOTSYS, among other quantities:
fitsio.FITS(targetfile)['SCND_TARGETS']
file: /global/cfs/cdirs/desi/target/catalogs/dr9/1.1.1/targets/main/secondary/dark/targets-dark-secondary.fits
extension: 1
type: BINARY_TBL
extname: SCND_TARGETS
rows: 7125595
column info:
RA f8
DEC f8
PMRA f4
PMDEC f4
REF_EPOCH f4
OVERRIDE b1
FLUX_G f4
FLUX_R f4
FLUX_Z f4
PARALLAX f4
GAIA_PHOT_G_MEAN_MAG f4
GAIA_PHOT_BP_MEAN_MAG f4
GAIA_PHOT_RP_MEAN_MAG f4
GAIA_ASTROMETRIC_EXCESS_NOISE f4
TARGETID i8
DESI_TARGET i8
SCND_TARGET i8
SCND_ORDER i4
SUBPRIORITY f8
OBSCONDITIONS i8
PRIORITY_INIT i8
NUMOBS_INIT i8
Because (at least, as I recall) I always had the code in place for matching back to the Legacy Surveys for Main Survey secondary targets, I think that there should be two differences with the Main Survey files:
(1) The fluxes should be included.
(2) The incantation at the top of this thread should recover PHOTSYS.
I'm not sure if you want other columns (e.g. the WISE fluxes) but (again, I think) at least (1) and (2) should be available.
I think this issue has been handled in the latest (Everest) release, but let's open a new ticket if other issues are uncovered.
Summarizing an off-list conversation with Raga, @geordie666, and @stephjuneau:
To get PHOTSYS from TARGETID (thanks to @geordie666):
This should return "N"/"S" as expected, or None if the target has no Legacy Surveys provenance (by which I mean that no valid TARGETID was propagated from dr9 for the target).
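A minimal sketch of such a helper (the function name is hypothetical; it assumes the DR9 convention visible earlier in this thread, namely that DR9 RELEASE values are > 9000 and that RELEASE 9011 corresponds to the northern imaging, with the remaining DR9 releases being southern):

import numpy as np
from desitarget.targets import decode_targetid

def photsys_from_targetid(targetids):
    """Return 'N'/'S' per target, or None where there is no DR9 provenance."""
    _, _, releases, _, _, _ = decode_targetid(np.atleast_1d(targetids))
    photsys = []
    for release in releases:
        if release < 9000:
            photsys.append(None)  # no valid DR9 TARGETID was propagated
        elif release == 9011:
            photsys.append('N')   # assumed: DR9 BASS/MzLS (north)
        else:
            photsys.append('S')   # assumed: other DR9 releases are DECaLS (south)
    return photsys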
But then Raga found: