wmwv opened 2 months ago
The Rubin/LSST images haven't been processed. Need some way of getting the image list. If we use the eimage files and assume the WCS is perfect we might be able to do this still.
On NERSC, the RomanDESCSim data are in
/global/cfs/cdirs/lsst/production/roman-desc-sims/
With RomanWFI/TDS data in
/global/cfs/cdirs/lsst/production/roman-desc-sims/Roman_data/RomanTDS/images/simple_model
And Rubin/LSST data in
/global/cfs/cdirs/lsst/production/roman-desc-sims/raw_data/
There's a sample reduction for the Rubin/LSST data done (by Jim Chiang?) with repo and collection
repo = "/global/cfs/cdirs/lsst/production/roman-desc-sims/preview_data_staging/preview_data/"
collection = "u/descdm/preview_data_step1_w_2024_12/20240326T152819Z/"
Ah, I was confused. Correct repo, collection:
repo = "/global/cfs/cdirs/lsst/production/gen3/roman-desc-sims/repo"
collections = ["u/descdm/preview_data_step1_w_2024_12"]
Here's a snippet to get the matching images for a given (RA, Dec):

from lsst.daf.butler import Butler
from lsst import sphgeom

transient_id = 30328322
ra, dec = 8.52941151, -43.0266337
mjd_start, mjd_end = 62300.0, 62600.0

repo = "/global/cfs/cdirs/lsst/production/gen3/roman-desc-sims/repo"
collections = ["u/descdm/preview_data_step1_w_2024_12"]
# The Step3/coadd collection is:
# collections = ["u/descdm/preview_data_step3_2877_19_w_2024_12"]
butler = Butler(repo, collections=collections)
# butler.registry.queryCollections()  # list available collections

level = 20  # resolution of the HTM grid; must match the "htm20" dimension below
pixelization = sphgeom.HtmPixelization(level)
htm_id = pixelization.index(
    sphgeom.UnitVector3d(
        sphgeom.LonLat.fromDegrees(ra, dec)
    )
)
dataset_refs = butler.registry.queryDatasets("calexp", htm20=htm_id)
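The query above returns every overlapping calexp regardless of epoch, while mjd_start/mjd_end go unused. A minimal sketch of splitting the exposures into before/during/after the SN window, written as a pure-Python helper so the logic is testable without the LSST stack (the commented butler call is an assumption; the exact attribute path to an exposure's MJD depends on the stack version):

```python
# Hypothetical helper: classify observation epochs relative to a SN window.
def classify_epochs(mjds, mjd_start, mjd_end):
    """Split observation MJDs into before/during/after the SN window."""
    epochs = {"before": [], "during": [], "after": []}
    for mjd in mjds:
        if mjd < mjd_start:
            epochs["before"].append(mjd)
        elif mjd <= mjd_end:
            epochs["during"].append(mjd)
        else:
            epochs["after"].append(mjd)
    return epochs

# With the butler, the per-exposure MJDs might come from something like
# (not run here; attribute names are an assumption):
#   mjds = [butler.get("calexp.visitInfo", ref.dataId).date.toAstropy().mjd
#           for ref in dataset_refs]
epochs = classify_epochs([62250.0, 62350.0, 62700.0], 62300.0, 62600.0)
```

This is just bookkeeping, but it's the split that produces the "before and during lists" discussed below.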
Reading and processing Roman and Rubin data!
Next, need to rethink dataset, which is now per-function but needs to be per-row.
Dropped dataset. Just tracking based on instrument.
Next steps:
Selecting supernovae with:
import numpy as np
from astropy.table import Table

file = "/global/cfs/cdirs/descssim/imSim/skyCatalogs_v1.1.2/snana_10306.parquet"
df = Table.read(file)
w, = np.where(
    (df["z_CMB"] < 0.7)
    & (df["ra"] > 8.3) & (df["ra"] < 8.7)
    & (df["dec"] > -43.5) & (df["dec"] < -42.5)
    & (df["start_mjd"] > 62200)
)
yields 391 candidates.
[Edit: Give absolute path for skyCatalogs.]
Of the first 10 and last 10 candidates of those 391, the following three had Rubin simulated images during the SN that have been processed out of the RomanDESCSims:
50130277 50132692 110000220
Updated to include a collection with many more (all?) of the images. Getting much longer before and during lists now. (Not as many after, because the end date is super conservative and the survey ends.)
import numpy as np
from astropy.table import Table

file = "/global/cfs/cdirs/descssim/imSim/skyCatalogs_v1.1.2/snana_10306.parquet"
df = Table.read(file)
w, = np.where(
    (df["z_CMB"] < 0.4)
    & (df["ra"] > 8.3) & (df["ra"] < 8.7)
    & (df["dec"] > -43.5) & (df["dec"] < -42.5)
    & (df["start_mjd"] > 62200)
)
df[w]["id", "ra", "dec", "z_CMB", "start_mjd", "end_mjd"].write("list.csv")
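The per-SN list.csv can then drive the image query one row at a time. A sketch of that glue using only the standard library, with an inline CSV string standing in for the file (the id is one of the matched candidates above; the ra/dec/z_CMB/MJD values are illustrative, not from the catalog):

```python
import csv
import io

# Inline stand-in for list.csv; in practice, open("list.csv") instead.
sample = """id,ra,dec,z_CMB,start_mjd,end_mjd
50130277,8.45,-43.1,0.35,62310.0,62380.0
"""

targets = []
with io.StringIO(sample) as fh:
    for row in csv.DictReader(fh):
        targets.append({
            "id": int(row["id"]),
            "ra": float(row["ra"]),
            "dec": float(row["dec"]),
            "mjd_start": float(row["start_mjd"]),
            "mjd_end": float(row["end_mjd"]),
        })

# Each entry in `targets` supplies the (ra, dec) for an HTM index and the
# MJD window, i.e. one queryDatasets call per supernova.
```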
Run a supernova from RomanDESCSim with both Roman and Rubin/LSST data