Abstract:
Within the framework of a "signal-interference-noise" model, we consider the problem of recognizing two signals from a sequence of vector observations. The asymptotic normality of the log-likelihood ratio is demonstrated, and asymptotically sufficient statistics are found for a small (signal + interference)-to-noise ratio in a single observation. As an example, we consider the case where the interference is a time delay in the appearance of the signal.