Abstract:
Fluxes emitted in different wavebands from active galactic nuclei (AGNs)
fluctuate on both long and short timescales. This variability is typically
characterized by a broadband power spectrum that exhibits a red-noise process
at high frequencies. The standard method of estimating the power spectral
density (PSD) of AGN variability is susceptible to systematic biases such as
red-noise leakage and aliasing, particularly when the observation spans a
relatively short period and contains gaps. Focusing on the high-frequency PSD that
is strongly distorted by red-noise leakage but usually not significantly
affected by aliasing, we develop a novel observable, the normalized leakage
spectrum (NLS), which sensitively quantifies the effect of leaked red-noise
power on the PSD at different temporal frequencies. Using Monte Carlo
simulations, we demonstrate how the underlying PSD of an AGN determines the
NLS when red-noise leakage is severe, and thereby how the NLS can be used to
place effective constraints on the underlying PSD.