One-bit analog-to-digital converters (ADCs) have attracted growing interest owing to their high power efficiency, simple structure, and high sampling rates. Despite the strongly nonlinear behavior of the ADC, a linear representation of its output can be obtained, which enables the investigation of systems with ADCs in a linear framework. For example, the Bussgang decomposition splits the ADC output into a linear term describing the input and an uncorrelated residual fluctuation. However, this linear term alone cannot adequately capture the nonlinearity of the ADC, and the overall system performance degrades in the high signal-to-noise ratio (SNR) regime. This paper presents a different approach for decomposing the one-bit ADC output within the stochastic resonance (SR) framework, which exploits the noise in the input signal. First, we decompose the one-bit ADC output into its symbol-wise expected value and a residual fluctuation. Then, this expected value is further decomposed into a linear term and nonlinear higher-order terms. We theoretically show the importance of the higher-order terms and demonstrate how they contribute to the performance of channel estimation and signal demodulation in a multi-antenna receiver.
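As a concrete illustration (a sketch, not taken from this paper), the Bussgang decomposition mentioned above can be demonstrated numerically for a one-bit quantizer y = sign(x) with zero-mean Gaussian input x: the Bussgang gain is A = E[xy]/E[x^2] (equal to sqrt(2/pi)/sigma in theory), and the residual e = y - Ax is uncorrelated with x. The variable names and the Monte Carlo setup here are illustrative assumptions:

```python
import math
import random

random.seed(0)
sigma = 1.0
n = 200_000

# Gaussian input and one-bit ADC output y = sign(x)
x = [random.gauss(0.0, sigma) for _ in range(n)]
y = [1.0 if v >= 0.0 else -1.0 for v in x]

# Empirical Bussgang gain A = E[xy] / E[x^2];
# theory predicts sqrt(2/pi)/sigma for Gaussian input
exy = sum(a * b for a, b in zip(x, y)) / n
exx = sum(a * a for a in x) / n
A = exy / exx
print(A)  # near sqrt(2/pi) ~= 0.798 for sigma = 1

# Residual e = y - A*x is uncorrelated with x by construction
e = [b - A * a for a, b in zip(x, y)]
exe = sum(a * c for a, c in zip(x, e)) / n
print(abs(exe))  # near zero
```

The decorrelation E[xe] = 0 holds exactly for the empirical gain, since exy - A*exx = 0 by the definition of A; the abstract's point is that this linear term A*x alone discards the nonlinear structure of the quantizer, which matters at high SNR.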