Cells were serum-starved or treated with lovastatin (5 μM) or roscovitine (5 μM) for 36 hrs prior to IFN-α treatment.

The mean activation times (± 95% confidence interval) of IRF9 were 8.9482 ± 0.2490 hrs (control), 8.2770 ± 0.2310 hrs (serum starvation), 7.7281 ± 0.1588 hrs (lovastatin), and 9.8733 ± 0.3424 hrs (roscovitine). The corresponding coefficients of variation (CVs) for IRF9 activation times were 0.3484, 0.3268, 0.2924, and 0.3868, respectively.

The mean activation times (± 95% confidence interval) of USP18 were 13.7266 ± 0.4215 hrs (control), 11.4415 ± 0.3146 hrs (serum starvation), 9.6591 ± 0.2210 hrs (lovastatin), and 8.7585 ± 0.2450 hrs (roscovitine). The corresponding CVs for USP18 activation times were 0.3844, 0.3220, 0.3257, and 0.3121, respectively.

Under serum starvation, the mean IRF9 activation time decreased by 0.67 hr (−7.5%) and the mean USP18 activation time decreased by 2.29 hrs (−16.6%). Under lovastatin, the mean IRF9 activation time decreased by 1.22 hrs (−13.6%) and the mean USP18 activation time decreased by 4.07 hrs (−29.6%). Under roscovitine, the mean IRF9 activation time increased by 0.93 hr (+10.3%) while the mean USP18 activation time decreased by 4.97 hrs (−36.2%).

Under serum starvation and lovastatin, both IRF9 and USP18 activation times decreased, but USP18 activation times decreased to a greater extent, resulting in shorter delay times. Under roscovitine, the IRF9 activation time increased and the USP18 activation time decreased, both of which shorten the delay time. However, the change in USP18 activation time was about five times larger in magnitude than the change in IRF9 activation time (−4.97 hrs vs +0.93 hr), consistent with the decrease in USP18 activation time being the major contributor to the change in the delay time under this condition.
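The absolute and percentage changes quoted above follow directly from the reported mean activation times. A minimal sketch of that arithmetic (means taken verbatim from the text; variable names are illustrative):

```python
# Changes in mean activation time relative to control, from the
# reported means (hrs). Percent change = 100 * (treated - control) / control.
means = {
    "IRF9":  {"control": 8.9482, "serum-starvation": 8.2770,
              "lovastatin": 7.7281, "roscovitine": 9.8733},
    "USP18": {"control": 13.7266, "serum-starvation": 11.4415,
              "lovastatin": 9.6591, "roscovitine": 8.7585},
}

for gene, t in means.items():
    ctrl = t["control"]
    for cond in ("serum-starvation", "lovastatin", "roscovitine"):
        delta = t[cond] - ctrl            # change in hours vs. control
        pct = 100.0 * delta / ctrl        # percent change vs. control
        print(f"{gene} {cond}: {delta:+.2f} hrs ({pct:+.1f}%)")
```

Rounding the output to two decimal places for the hour changes and one for the percentages reproduces the values in the text, e.g. IRF9 under serum starvation gives −0.67 hrs (−7.5%) and USP18 under roscovitine gives −4.97 hrs (−36.2%).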