Time-Delay Channel Estimation Through Adaptive Filtering

This example shows how to adaptively estimate the time delay for a noisy input signal using the LMS adaptive FIR algorithm.

Assume a signal $s[n] = a[n]+w[n]$ where $w[n]$ is a white Gaussian process and $a[n]$ is deterministic. The signal is measured with an echo of $M$ samples and attenuation $\alpha$ (both are unknown), resulting in the overall measurement:

$$ x[n] = s[n] + \alpha s[n-M] $$
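To make the signal model concrete, here is a minimal NumPy sketch (separate from the Simulink model in this example). It assumes a hypothetical sinusoidal choice for $a[n]$ and uses the example values $M=8$ and $\alpha = 1/2$ that appear later:

```python
import numpy as np

rng = np.random.default_rng(0)

# Signal model: s[n] = a[n] + w[n], with a[n] deterministic and
# w[n] a white Gaussian process. The sinusoid for a[n] is an
# illustrative assumption, not fixed by the example.
n = np.arange(4000)
a = np.sin(0.05 * np.pi * n)         # hypothetical deterministic component
w = rng.standard_normal(n.size)      # white Gaussian noise
s = a + w

# Measurement with an unknown echo: x[n] = s[n] + alpha * s[n - M]
M, alpha = 8, 0.5                    # unknown to the estimator
x = s + alpha * np.concatenate((np.zeros(M), s[:-M]))
```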

The goal is to estimate the delay $M$ and the echo attenuation $\alpha$. One can determine these parameters by solving the filter identification problem $x = h*s$ for $h$, combined with the prior $h[n] = \delta[n]+\alpha \delta[n-M]$. Provided that the filter $h$ can be identified from the measured signal $x$ and the original signal $s$, one can derive $\alpha$ and $M$ from the location and value of its nonzero taps.

Such a filter identification problem can be posed in terms of adaptive FIR filtering. The reference (desired) signal is $d[n] = x[n]$, the input fed to the adaptive filter is $s[n]$, and the adaptive filter is $w$. If the adaptation process converges, so that $w\to h$, then the error signal $x - w*s = (h-w)*s$ vanishes.

There are numerous adaptive filtering algorithms. For this particular problem setup and signal model, the normalized LMS algorithm is suitable, and is available in the LMS Filter block.
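For intuition about what the block computes, the following is a minimal normalized LMS sketch in NumPy, not the Simulink implementation. The function name `nlms_identify`, the tap count, and the step size are illustrative assumptions; the signal model repeats the sketch above:

```python
import numpy as np

def nlms_identify(s, x, num_taps=16, mu=0.5, eps=1e-6):
    """Normalized LMS: adapt FIR taps w so that w * s approximates x."""
    w = np.zeros(num_taps)
    u = np.zeros(num_taps)                 # delay line, u[0] = s[n]
    for n in range(len(s)):
        u = np.roll(u, 1)
        u[0] = s[n]
        y = w @ u                          # adaptive filter output
        e = x[n] - y                       # error against the reference x[n]
        w += (mu / (eps + u @ u)) * e * u  # normalized LMS update
    return w

# Example (same assumed signal model as the sketch above):
rng = np.random.default_rng(0)
n = np.arange(4000)
s = np.sin(0.05 * np.pi * n) + rng.standard_normal(n.size)
M, alpha = 8, 0.5
x = s + alpha * np.concatenate((np.zeros(M), s[:-M]))

w_hat = nlms_identify(s, x)
print(np.round(w_hat, 2))   # expect ~1 at tap 0 and ~0.5 at tap 8
```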

Run the simulation. The location of the secondary peak in the filter tap vector indicates the time-delay estimate, and its height indicates the attenuation estimate. In this case $M=8$ and $\alpha = \frac{1}{2}$.
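Reading the estimates off a converged tap vector can be sketched as follows; `echo_parameters` is a hypothetical helper continuing the NumPy sketch above, not part of the LMS Filter block:

```python
import numpy as np

def echo_parameters(w_hat):
    """Estimate (M, alpha) from a converged tap vector w_hat."""
    M_hat = int(np.argmax(np.abs(w_hat[1:]))) + 1  # largest tap after the direct path
    alpha_hat = w_hat[M_hat]
    return M_hat, alpha_hat

# e.g. echo_parameters(w_hat) -> (8, ~0.5) for the assumed signal model above
```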

For details, see S. Haykin, Adaptive Filter Theory, 3rd Ed., Prentice-Hall 1996.
