AstarVienna / ScopeSim

A telescope observation simulator for Python.
GNU General Public License v3.0

Ideas for uniting spectroscopic mode infrastructure #380

Open astronomyk opened 8 months ago

astronomyk commented 8 months ago

Pinging @teutoburg and @hugobuddel in case you have ideas of what (not) to do...

Status of Spectroscopy in ScopeSim

ScopeSim is being used to simulate 5 spectroscopic modes of 4 instruments (MICADO-LSS, METIS-LSS, METIS-IFU, MOSAIC-MOS, MAAT-IFU). In the future we will also want to add instruments like HARMONI (IFU) and MOONS (MOS). Currently each mode has its own unique ScopeSim solution, based to varying degrees on the SpectralTraceList object, or reusing parts of the underlying mechanics (e.g. XiLamImage).

This approach is not very sustainable in the long run, and it does not lend itself to being extended to other instruments. As such, it is probably useful to refactor (or overhaul) the spectroscopic components of ScopeSim sooner rather than later, not least because a lot of the existing code still relies on the deprecated fov_grid method.

Where we want to end up

Below is a flowchart of the various ways spectroscopic instruments could be represented within the ScopeSim environment. The common elements are that every spectroscope starts with one or more entrance apertures (which map to FOVs), and produces one or more spectral traces on the detector plane.

There are various combinations of the aperture-trace connections:


```mermaid
flowchart TD

    MICADO_LSS{{MICADO_LSS}} --> LSS([LSS])
    METIS_LSS{{METIS_LSS}} --> LSS
    METIS_IFU{{METIS_IFU}} --> IFU([IFU])
    HARMONI{{HARMONI}} --> IFU
    MOSAIC{{MOSAIC}} --> MOS([MOS])
    MOONS{{MOONS}} --> MOS

    LSS -- Single FOV --> ApertureList_1
    ApertureList_1 -- Long Slit --> SubApertureList_1
    SubApertureList_1 -- Single Trace (N=1) \n (METIS LSS) --> SpectralTraceList
    SubApertureList_1 -- Multiple Traces (N>1) \n (MICADO LSS) --> SpectralTraceList

    IFU -- Single FOV --> ApertureList_1
    ApertureList_1 -- Full IFU window --> SubApertureList_N
    SubApertureList_N -- Multiple Traces (N>1) \n (METIS IFU) \n (HARMONI) --> SpectralTraceList

    MOS -- Multiple FOVs --> ApertureList_N
    ApertureList_N -- Fibre Bundles --> SubApertureList_N
    SubApertureList_N -- Multiple traces per FOV \n (MOSAIC) --> UnresolvedSpectralTraceList_N
    ApertureList_N -- Single trace per FOV \n (MOONS) --> UnresolvedSpectralTraceList_N
```

Noteworthy is that there are also two different types of spectral traces: traces which preserve the along-slit spatial information (e.g. LSS, IFU) and those that lose the spatial information (e.g. fibre-fed MOS or Echelle).
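To make that distinction concrete, here is a minimal sketch (hypothetical class names, not existing ScopeSim objects) of the two kinds of mapping a trace description has to provide:

```python
# Illustrative only: not ScopeSim classes, just the two kinds of mapping that
# a trace description has to provide.
import numpy as np


class ResolvedTraceSketch:
    """Keeps the along-slit coordinate xi: maps (xi, lambda) -> detector (x, y)."""

    def map(self, xi_arcsec, wave_um):
        x = 50.0 * (wave_um - wave_um.min())   # toy linear dispersion [pixel]
        y = 10.0 * xi_arcsec                   # spatial direction preserved [pixel]
        return x, y


class UnresolvedTraceSketch:
    """Fibre-fed / Echelle-style: slit profile collapsed, maps only lambda -> (x, y)."""

    def __init__(self, trace_y=0.0):
        self.trace_y = trace_y                 # fixed cross-dispersion position [pixel]

    def map(self, wave_um):
        x = 50.0 * (wave_um - wave_um.min())
        y = np.full_like(wave_um, self.trace_y)
        return x, y
```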

New or refurbished classes required to generalise scopesim-spectroscopy

Several of these classes already exist in some form inside ScopeSim; however, they may need significant modification to fit in with the new scheme proposed above.

ApertureList

This should be used only to define the regions on the sky that are to be mapped to FieldOfView objects. For LSS or IFU instruments this should map to the full on-sky region covered by the slit or image slicer. In this case it would be a list containing a single entry. For MOS instruments, this would be a list containing N entries, one per fibre or fibre bundle.
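As a rough sketch of this intent (names and geometry are illustrative, not the current ScopeSim implementation), an ApertureList would carry nothing but on-sky regions, each of which maps to exactly one FieldOfView:

```python
# Hypothetical sketch of the proposed role of ApertureList: it only defines
# on-sky regions, one FieldOfView per entry.  Names are illustrative.
from dataclasses import dataclass


@dataclass
class SkyRegion:
    """A rectangular on-sky region in arcsec relative to the pointing centre."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float


@dataclass
class FieldOfViewSketch:
    region: SkyRegion


class ApertureListSketch:
    def __init__(self, regions):
        self.regions = regions      # 1 entry for LSS/IFU, N entries for MOS fibres

    def fields_of_view(self):
        return [FieldOfViewSketch(region) for region in self.regions]


# LSS / IFU: a single region covering the whole slit or image slicer
lss_apertures = ApertureListSketch([SkyRegion(-1.5, 1.5, -0.025, 0.025)])
# MOS: one region per fibre (bundle)
mos_apertures = ApertureListSketch([SkyRegion(x - 0.3, x + 0.3, -0.3, 0.3)
                                    for x in (0.0, 5.0, 10.0)])
```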

SubApertureList

There are cases where it is imperfect to use a list of apertures, each mapped to a FieldOfView object. For example, image slicer IFUs should not be seen as N independent long-slits (a major problem with the current ScopeSim design), as the outer regions of the PSF will overlap with the adjacent "quasi-long-slits" that are the other image slices. For all pre-image-slicer effects (e.g. ADC, PSF) ScopeSim should only be using a single FOV that covers the full image slicer aperture. However, each trace on the detector should map to an extracted sub-FOV that corresponds to each separate quasi-slit.

A similar case can be made for the fibre bundles expected for the MOSAIC MOS (X=7) and IFU (X=91) modes of the instrument.

For this purpose ScopeSim needs a SubApertureList to describe these sub-FOV regions and their mapping to the detector traces, without disregarding the information about the on-sky region covered by the primary Aperture. This should be an optional Effect for spectroscopic modes, as it should not be necessary to define two N=1 Aperture-like objects for a simple long-slit instrument (e.g. METIS-LSS).
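A hypothetical sketch of that flow (illustrative names, numbers and geometry) could look like this:

```python
# Hypothetical sketch of the SubApertureList idea (illustrative names and
# geometry): pre-slicer effects see ONE FieldOfView covering the whole image
# slicer, and only afterwards is it cut into the quasi-long-slit sub-FOVs that
# map onto the detector traces.
import numpy as np


def slice_full_fov(fov_image, n_slices):
    """Cut the full image-slicer FOV into quasi-long-slit sub-FOVs along y."""
    return np.array_split(fov_image, n_slices, axis=0)


# 1. Apply pre-slicer effects (ADC shift, PSF convolution, ...) to one full FOV,
#    so that flux can leak between what will later become separate slices.
full_fov = np.random.random((64, 64))     # stand-in for the convolved full-slicer FOV

# 2. Only then extract the sub-apertures, one per spectral trace on the detector.
sub_fovs = slice_full_fov(full_fov, n_slices=28)   # slice count is illustrative
```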

SpectralTraceList

This effect class describes the mapping of (sub-)apertures to the traces seen on the detector. The existing Effect object relies on several internal classes (e.g. SpectralTrace and XiLamImage) which themselves are not Effect objects. In order to simplify the interface, it would make sense to use a SpectralTraceList object for all cases (N=1 and N>1), as a single trace can simply be a list with one entry.

The existing version of this class retains the spatial information along the slit (dimension Xi). It may therefore be prudent to rename this as Resolved- or SpatiallyCoherentSpectralTrace, so that the distinction from fibre-fed traces is clear.

This class will need to be refactored in order to remove the need for a .fov_grid method.
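One possible shape of the post-fov_grid interface, purely as a hedged sketch with hypothetical names (not the current ScopeSim API), would be to expose the information that the FOV-building machinery needs as plain attributes and methods:

```python
# Hedged sketch of one possible post-fov_grid interface: the trace list exposes
# plain properties that whatever assembles the FieldOfView objects can query,
# and a single trace is simply a list with one entry.
from dataclasses import dataclass, field


@dataclass
class SpectralTraceSketch:
    aperture_id: int
    wave_min: float     # [um]
    wave_max: float     # [um]

    def xy(self, xi, wave):
        """Toy mapping of along-slit position xi and wavelength to detector (x, y)."""
        x = 2048.0 * (wave - self.wave_min) / (self.wave_max - self.wave_min)
        y = 10.0 * xi
        return x, y


@dataclass
class SpectralTraceListSketch:
    traces: list = field(default_factory=list)   # works identically for N=1 and N>1

    @property
    def wave_range(self):
        """Overall wavelength range needed to build the FieldOfView objects."""
        return (min(t.wave_min for t in self.traces),
                max(t.wave_max for t in self.traces))

    def traces_for_aperture(self, aperture_id):
        return [t for t in self.traces if t.aperture_id == aperture_id]
```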

UnresolvedSpectralTraceList

This effect is essentially the 1-D version of the SpectralTraceList for fibre-fed spectrographs, where the along-slit spatial information is lost. In its current form it makes use of the XiLamImage object, but this is a very hacky way of doing things.
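A minimal sketch of how the fibre-fed case could avoid the XiLamImage detour (illustrative names and numbers, not a proposed API) would be to collapse the spatial axes to a 1-D spectrum and paint that directly along the trace:

```python
# Hypothetical sketch of the "unresolved" (fibre-fed) case: the along-slit axis
# is collapsed to a 1-D spectrum before the trace is painted on the detector,
# so no XiLamImage-style 2-D intermediate is needed.  Illustrative only.
import numpy as np


def collapse_fov_to_spectrum(fov_cube):
    """Sum a (wave, y, x) FOV cube over both spatial axes -> 1-D spectrum."""
    return fov_cube.sum(axis=(1, 2))


def paint_fibre_trace(image, spectrum, x0, y0):
    """Write a 1-D spectrum onto the detector image along a straight toy trace."""
    for i, flux in enumerate(spectrum):
        image[y0, x0 + i] += flux       # real traces would be curved and resampled


detector = np.zeros((512, 512))
cube = np.random.random((256, 5, 5))    # stand-in for one fibre's FOV cube
paint_fibre_trace(detector, collapse_fov_to_spectrum(cube), x0=100, y0=200)
```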

Way forward

M.V.P

Class descriptions

.. todo: add class descriptions here

Addendum - Problems that this could solve with current ScopeSim spec modes

Dimensioning of the ImagePlane and FieldOfView objects is currently done by looking for either Aperture objects or DetectorArray objects (or both). With the new scheme, the FOVs could be defined simply by the Aperture effects, and the ImagePlane can be tied solely to the DetectorArray effect objects.
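A sketch of that separation of responsibilities (hypothetical helper names and toy numbers, not existing ScopeSim code):

```python
# Sketch of the proposed separation of responsibilities: FieldOfView extents
# come only from Aperture-like effects, while the ImagePlane extent comes only
# from the DetectorArray effect.  Names and numbers are illustrative.
from dataclasses import dataclass


@dataclass
class Extent:
    x_min: float
    x_max: float
    y_min: float
    y_max: float


def fov_extents(aperture_extents):
    """One FieldOfView extent per aperture entry; nothing here looks at detectors."""
    return [Extent(*ap) for ap in aperture_extents]


def image_plane_extent(detector_footprints):
    """The ImagePlane is sized by the detector layout alone."""
    return Extent(min(d[0] for d in detector_footprints),
                  max(d[1] for d in detector_footprints),
                  min(d[2] for d in detector_footprints),
                  max(d[3] for d in detector_footprints))


# e.g. one long-slit aperture (arcsec) and a 2x1 detector mosaic (mm)
fovs = fov_extents([(-1.5, 1.5, -0.025, 0.025)])
imp = image_plane_extent([(-40.0, 0.0, -20.0, 20.0), (0.0, 40.0, -20.0, 20.0)])
```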

teutoburg commented 8 months ago

Just a quick thought on naming things: maybe the not-spatially-coherent variant could be called FiberTraceList or something along those lines. Or is there any case where it would be used that doesn't involve an optical fibre?

hugobuddel commented 8 months ago

The Field of View handling is a part of ScopeSim that I don't yet fully understand. Which perhaps is a good thing, because that means it does actually work well enough that I never had to look into it, while it is very important. So maybe this is a good time to learn.

Could you perhaps provide a legend to the figure? Let me try to figure it out.

I don't understand the need for the SubApertureList, which is probably a good point to highlight my misunderstanding of the whole system. It seems to me that it would be fine to see image slicer IFUs as N independent long-slits (each with their own FoV), and then somewhere at the end combine all those FoVs into a single image. Any overlap can be dealt with in this combining phase without any of those long-slit FoVs needing any knowledge of the other long slits. Or is this naive?

Perhaps you could describe your new scheme in a telecon?

teutoburg commented 8 months ago

I updated the shapes of the boxes to represent different things (modes vs. "meta-modes" vs. classes).

astronomyk commented 8 months ago

Sorry for the delay - sick wife on Friday, so I had to take a day of carers leave.

Please don't try to map this diagram too closely to existing ScopeSim structures. It's more to show the series of concepts that we'll need to consider for a generalised spectroscopic infrastructure for ScopeSim than an accurate implementation model. -> Indeed, let's discuss it in detail in a telecon (Monday?)

One quick comment though regarding the (indeed naive) view that an IFU can be represented as a series of long-slits. This was my approach too initially, so I have been down this rabbit hole... consider the two important cases of:

  1. bright objects that are "officially" only in one slice, yet have long PSF diffraction spikes that would leak into adjacent slices, and
  2. A non- or semi-functional ADC that "allows" the position of a point source to wander between image slices at the blue and red extremes of the wavelength ranges.

Unfortunately both of these effects will be prominent in the METIS IFU, and have also been requested by MICADO and MOSAIC (to lesser degrees). My solution for this is the SubApertureList class, but there may be a better way to go about it.

Anyway, some food for thought for the next telecon :)

Enjoy the remainder of your weekend!

The Field of View handling is a part of ScopeSim that I don't yet fully understand. Which perhaps is a good thing, because that means it does actually work well enough that I never had to look into it, while it is very important. So maybe this is a good time to learn.

Could you perhaps provide a legend to the figure? Let me try to figure it out.

  • All boxes with a boundary look identical, but they seem to be representing different things. Let's see,
    • The top row of boxes (MICADO_LSS etc.) are some abstract 'instrument mode' that is not represented by anything concrete in our code base? Maybe we could list the names of the modes as defined in the IRDB?
    • The second row of boxes (LSS etc.) is some abstract ScopeSim mode? Will/should there be something concrete in the IRDB or ScopeSim that represents those?
    • The last three lines of boxes seem to be Effect classes that modify FoVs in some way. Are they all subclasses of the ApertureList class?
  • The gray boxes on the arrows all represent the FieldOfView class? Or a list of those? Or subclasses of the FieldOfView class?
    • The lowest list of gray boxes are 'per FoV', so they are not FoVs themselves?
  • The arrows are the order in which FoV instances are manipulated by the various ApertureList classes?
I don't understand the need for the SubApertureList, which is probably a good point to highlight my misunderstanding of the whole system. It seems to me that it would be fine to see image slicer IFUs as N independent long-slits (each with their own FoV), and then somewhere at the end combine all those FoVs into a single image. Any overlap can be dealt with in this combining phase without any of those long-slit FoVs needing any knowledge of the other long slits. Or is this naive?

Perhaps you could describe your new scheme in a telecon?


hugobuddel commented 8 months ago

Can't these both be solved by overlapping FoVs that are still independent of each other? Then at some point in the process they are combined, resolving the overlap (by summing or cropping or whatever).

  1. bright objects that are "officially" only in one slice, yet have long PSF diffraction spikes that would leak into adjacent slices, and

The diffraction spikes are from the telescope itself and fully captured by the PSF, right? If some effect selects a FoV, then the upstream PSF effect should expand that FoV by the size of the PSF so any bright source nearby is fully represented.

  2. A non- or semi-functional ADC that "allows" the position of a point source to wander between image slices at the blue and red extremes of the wavelength ranges.

Here it would then be the ADC effect that realizes that it needs to expand the FoV slightly to include any sources just outside the original FoV.

Both of these would mean that there are overlapping FoVs, but the overlap can be discarded, I think. To me this sounds preferable to FoVs that know about each other.
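To make the suggestion a little more concrete, here is a toy sketch of that approach (purely illustrative names and numbers, not ScopeSim code): each slice gets a cut-out padded by the PSF footprint, the PSF is applied to the padded region, and the padding is discarded again when the slices are recombined.

```python
# Toy sketch of the "independent but overlapping FoVs" idea: each slice FoV is
# padded by the PSF footprint before the PSF is applied, and the padding is
# cropped again when the slices are recombined.  Numbers are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter


def simulate_slice(sky, y0, y1, pad):
    """Convolve a padded cut-out of the sky, then crop back to the slice itself."""
    lo, hi = max(0, y0 - pad), min(sky.shape[0], y1 + pad)
    blurred = gaussian_filter(sky[lo:hi], sigma=3.0)     # stand-in for the PSF
    return blurred[(y0 - lo):(y0 - lo) + (y1 - y0)]


sky = np.zeros((90, 64))
sky[32, 32] = 1.0                        # bright source just inside the second slice

slice_height, pad = 30, 10
slices = [simulate_slice(sky, i, i + slice_height, pad)
          for i in range(0, sky.shape[0], slice_height)]
recombined = np.vstack(slices)           # flux leaked across slice borders is retained
```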

But maybe I'm misunderstanding the approach.