Random telegraph noise (RTN) is a common phenomenon in semiconductors, and it is often necessary to quantify its effect on circuits. In this paper we show that the series form of the characteristic-function method of nonlinear analysis is well suited to modeling and predicting the output correlation and spectrum of a combination of signals and RTN. We derive the general characteristic function for RTN defined by a single transition parameter and show that its spectrum is Lorentzian (Cauchy), falling off as the reciprocal of the square of frequency. We then show that the output spectrum is a weighted sum of the respective spectra of the input sinusoid and RTN. Using a simple large-signal model of a MOSFET amplifier, we compute the contributions to the output spectrum as the signal-to-noise ratio (SNR) and sinusoid amplitude are varied, and present numerical results. The procedure is easily adapted to other signals and nonlinear circuits.
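The Lorentzian character of the RTN spectrum claimed above can be illustrated with a minimal numerical sketch. The snippet below is not the paper's derivation; it assumes the standard textbook model of symmetric two-level RTN with amplitude ±a and transition rate λ per state, for which the autocorrelation is R(τ) = a² exp(−2λ|τ|) and the two-sided spectrum is the Lorentzian S(ω) = 4a²λ / (4λ² + ω²), which decays as 1/ω² well above the corner frequency. The function name `rtn_lorentzian_psd` and the Monte-Carlo check are illustrative, not from the paper.

```python
import numpy as np

def rtn_lorentzian_psd(f, lam, a=1.0):
    """Two-sided analytic PSD of symmetric two-level RTN.

    Model assumption: levels +/-a, transition rate lam per state, so
    R(tau) = a^2 * exp(-2*lam*|tau|)  =>  S(omega) = 4*a^2*lam / (4*lam^2 + omega^2).
    """
    omega = 2.0 * np.pi * np.asarray(f, dtype=float)
    return 4.0 * a**2 * lam / (4.0 * lam**2 + omega**2)

# Monte-Carlo check of the exponential autocorrelation that underlies the
# Lorentzian: simulate RTN as a two-state process with per-step flip
# probability ~ lam*dt, then estimate R(tau) at one lag.
rng = np.random.default_rng(0)
lam, dt, n = 1.0, 1e-3, 5_000_000
flips = rng.random(n) < lam * dt          # Bernoulli switching events
x = np.where(np.cumsum(flips) % 2 == 0, 1.0, -1.0)  # telegraph waveform

tau = 0.5                                  # test lag in seconds
k = int(tau / dt)
emp_corr = float(np.mean(x[:-k] * x[k:]))  # empirical R(tau)
theory = float(np.exp(-2.0 * lam * tau))   # a^2 * exp(-2*lam*tau) with a = 1

# High-frequency asymptote: S(f) ~ 1/f^2, so a decade in f costs ~100x in power.
rolloff = float(rtn_lorentzian_psd(100.0, lam) / rtn_lorentzian_psd(1000.0, lam))

print(f"R({tau}) empirical={emp_corr:.3f}  theory={theory:.3f}")
print(f"S(100)/S(1000) = {rolloff:.1f}  (~100 expected in the 1/f^2 region)")
```

The decade ratio S(100 Hz)/S(1 kHz) ≈ 100 makes the 1/f² roll-off concrete; near and below the corner at f = λ/π the spectrum flattens instead, which is the Lorentzian shape.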