Make a short-cadence light curve containing a single sinusoid with a period of around 5.55 min, plus Gaussian noise of known variance. Do the SPG, and check that we retrieve the correct amplitude and frequency.
Then bin up (average) the data by 2 s, 4 s, 8 s, and so on, and show that the amplitude and frequency we infer don't change.
This test has to work, or else the SPG has a bug in it. There should be no attenuation of the signal as we bin up, just noisification. Right?
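A minimal sketch of this test. Since the SPG itself isn't defined in this note, a weighted least-squares fit of sin and cos at a fixed frequency stands in for the amplitude estimate; all names, cadences, and parameter values here are illustrative assumptions. (Strictly, bin-averaging attenuates a sinusoid by a sinc factor, but for bin widths of a few seconds against a ~333 s period that factor is within a fraction of a percent of unity.)

```python
import numpy as np

rng = np.random.default_rng(17)

# Assumed setup: 1 s cadence over one day, one sinusoid, known noise variance.
dt = 1.0                          # seconds
t = np.arange(0.0, 86400.0, dt)   # one day of data
period = 5.55 * 60.0              # ~5.55 min, in seconds
freq = 1.0 / period
amp_true = 1.0e-3
sigma = 5.0e-3
y = amp_true * np.sin(2.0 * np.pi * freq * t) + rng.normal(0.0, sigma, t.size)
ivar = np.full(t.size, 1.0 / sigma**2)   # known inverse variances

def sinusoid_amplitude(t, y, ivar, freq):
    """Weighted least-squares fit of a*sin + b*cos at fixed frequency;
    returns the total amplitude sqrt(a^2 + b^2)."""
    A = np.stack([np.sin(2.0 * np.pi * freq * t),
                  np.cos(2.0 * np.pi * freq * t)], axis=1)
    a, b = np.linalg.solve(A.T @ (ivar[:, None] * A), A.T @ (ivar * y))
    return np.hypot(a, b)

def bin_up(t, y, ivar, n):
    """Inverse-variance-weighted average of consecutive groups of n points."""
    m = (t.size // n) * n
    tb = t[:m].reshape(-1, n).mean(axis=1)
    w = ivar[:m].reshape(-1, n)
    yb = (w * y[:m].reshape(-1, n)).sum(axis=1) / w.sum(axis=1)
    return tb, yb, w.sum(axis=1)   # binned times, fluxes, inverse variances

# Amplitude estimate at the true frequency for each binning factor.
amps = {}
for n in [1, 2, 4, 8, 16]:
    tb, yb, ivb = bin_up(t, y, ivar, n)
    amps[n] = sinusoid_amplitude(tb, yb, ivb, freq)
    print(f"bin {n:2d} s: amplitude = {amps[n]:.6f}")
```

All five estimates should agree with the input amplitude to well within the noise-set uncertainty, which is the point of the test: binning coarsens the sampling but should not bias the inferred amplitude.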