This came in through the marx helpdesk. I'll copy the relevant parts of the email exchange below. Essentially, the user wants to simulate off-axis sources, but in some cases they fall off the chip in marx even though they are on the chip in the real observations. Most cases are fixed by using the right set of chips (which would be easier after #46 is implemented) and by setting DetOffsetZ so that the SIM_Z of the observation is corrected to the default pointing of the detector used in the marx simulation (see also #47; a sketch of that correction follows below).
However, in some cases even that does not work, and I don't know why:
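For reference, here is a minimal sketch of that DetOffsetZ correction in Python. The nominal SIM_Z values and the sign convention are assumptions for illustration and are not taken from the marx source, and the event-file name is hypothetical; check the marx documentation before relying on this.

    # Hypothetical sketch: derive a marx DetOffsetZ value from the SIM_Z of an
    # observation.  The nominal SIM_Z aimpoints below are approximate (mm) and
    # the sign convention is an assumption.
    from astropy.io import fits

    # approximate default SIM_Z aimpoints in mm (assumed values)
    NOMINAL_SIM_Z = {
        "ACIS-I": -233.59,
        "ACIS-S": -190.14,
    }

    def det_offset_z(evt_file, detector="ACIS-S"):
        """Return a DetOffsetZ guess (mm) for a marx run matching evt_file."""
        hdr = fits.getheader(evt_file, ext=1)
        sim_z = hdr["SIM_Z"]  # observed SIM translation along Z (mm)
        # offset of the observed SIM position from the detector default
        return sim_z - NOMINAL_SIM_Z[detector]

    # usage (hypothetical file name):
    # print(det_offset_z("acisf12345N002_evt2.fits", "ACIS-S"))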
Possible causes
I suspect one of two things, but it could be something else entirely:
The observation is very close to the pole (dec < 1 deg). There could be a numerical problem or bug that's only seen that close to the pole. I don't think I've personally looked at an observation like that.
Chandra shows aimpoint drift due to thermal motions between the aspect camera and the HRMA. I've looked into that before, but I can't claim to understand how the different coordinate systems move with respect to each other. That drift is corrected for in mission planning, but I don't know whether the RA_NOM/DEC_NOM keywords contain that correction. I plotted the RA_NOM/DEC_NOM position in ds9 on top of the event list and it seems to fall into the chip gap, which is odd. MARX does not have a time-dependent aimpoint, so there might be an offset between the nominal values in the header and the real aimpoint that marx does not know about.
Possible workaround
Here is what I plan to do (and it sounds like you are using a similar procedure already): I will take the RA/DEC_NOM and the SIM_X from the header and run a MARX simulation with a large diffuse source that fills the field of view. In effect, I'll get an image of the FOV with somewhat fuzzy edges (due to dither). Then I can compare the chip corners in the marx simulation with the chip corners in the observed field and see how much they are offset. Given the offset in arcsec and the ACIS pixel scale, I can calculate how much I need to adjust SIM_Z to make it right (a sketch of that conversion is below).
(Of course, I can get the pixel corners in other ways, e.g. using the WCS in the evt file or directly from the MARX code, I'm just suggesting something that's easy to check in ds9.)
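Here is a minimal sketch of that arcsec-to-SIM_Z conversion, assuming the usual approximate ACIS numbers (0.492 arcsec and 24 micron per pixel); the function name is mine, and the sign of the adjustment still has to be taken from the direction in which the corners are shifted.

    # Minimal sketch of the offset -> SIM_Z conversion described above.
    # The constants are the usual approximate ACIS numbers; treat them as
    # assumptions rather than values read from marx.
    ACIS_ARCSEC_PER_PIX = 0.492   # approximate ACIS plate scale (arcsec/pixel)
    ACIS_MM_PER_PIX = 0.024       # 24 micron physical pixel size (mm/pixel)

    def sim_z_adjustment(offset_arcsec):
        """Translate a chip-corner offset (arcsec) into a SIM_Z shift in mm."""
        offset_pix = offset_arcsec / ACIS_ARCSEC_PER_PIX
        return offset_pix * ACIS_MM_PER_PIX

    # e.g. a 10 arcsec offset corresponds to roughly 0.49 mm of SIM_Z
    # print(sim_z_adjustment(10.0))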
That way, you can do the simulations you need. If you need help with that let me know and I'll see what I can do quickly.
What I will do is script this analysis and run it on many or even all observations in the archive. That way, I should be able to see where the difference comes from (a rough sketch of such a loop is below).
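A rough sketch of scripting this check over many ObsIDs, under the assumption that the event files are available locally; it only pulls the relevant header keywords, and the file-name pattern and the comparison against a measured offset are placeholders.

    # Rough sketch: gather the pointing-related header keywords needed for the
    # marx comparison from a set of local event files.
    from glob import glob
    from astropy.io import fits

    def collect_pointing_keywords(evt_files):
        """Collect RA/DEC_NOM, SIM positions, and detector name per ObsID."""
        rows = []
        for evt in evt_files:
            hdr = fits.getheader(evt, ext=1)
            rows.append({
                "obsid": hdr.get("OBS_ID"),
                "ra_nom": hdr.get("RA_NOM"),
                "dec_nom": hdr.get("DEC_NOM"),
                "roll_nom": hdr.get("ROLL_NOM"),
                "sim_x": hdr.get("SIM_X"),
                "sim_z": hdr.get("SIM_Z"),
                "detnam": hdr.get("DETNAM"),
            })
        return rows

    # usage (hypothetical directory layout):
    # print(collect_pointing_keywords(glob("archive/*/acisf*_evt2.fits")))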
List of examples which fail
Examples (ObsID, position in Jhhmmss.ss+ddmmss.s format):
14273 (12026), J134834.28+262205.9
05776, J100254.52+324039.0
17200 (18704, 18705), J114405.37+195602.0
12475, J134804.34+284025.3