3F3 Statistical Signal Processing Tripos Revision

Posted by Jingbiao on April 10, 2021, Reading time: 2 minutes.

\( \require{amstext} \require{amsmath} \require{amssymb} \require{amsfonts} \)

Full notes can be viewed here: rendered version and LaTeX Source Code

Markov Chain


  • Proof of the Markov property: show that \( P(X_{n+1}|X_n,X_{n-1},\dots,X_{1}) = P(X_{n+1}|X_{n}) \)
  • Two-step transition probability (Chapman-Kolmogorov): \( P(X_{n}|X_{n-2}) = \sum_{x_{n-1}} P(X_{n}|X_{n-1}=x_{n-1})\,P(X_{n-1}=x_{n-1}|X_{n-2}) \); a numerical check is sketched below
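
A minimal numerical sketch of the two-step relation above, assuming a small three-state chain with a hypothetical one-step transition matrix `P` (not from the notes); summing over the intermediate state is the same as squaring the transition matrix.

```python
import numpy as np

# Hypothetical one-step transition matrix: P[i, j] = P(X_{n+1} = j | X_n = i)
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# Two-step transition by summing over the intermediate state:
# P(X_n = j | X_{n-2} = i) = sum_k P(X_{n-1} = k | X_{n-2} = i) P(X_n = j | X_{n-1} = k)
P2 = np.einsum('ik,kj->ij', P, P)

assert np.allclose(P2, P @ P)   # identical to the matrix square
print(P2)
```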

Characteristic Function

  • Relation with the Fourier transform: the characteristic function is the Fourier transform of the pdf with the sign of the frequency variable reversed (replace $t$ with $-f$)
  • Moments from the n'th order derivative: \( E[X^n] = \frac{1}{j^n}\left.\frac{d^n \phi_X(t)}{dt^n}\right|_{t=0} \); a worked example follows
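
As a quick worked example of the moment formula (assuming the convention \( \phi_X(t) = E[e^{jtX}] \)), take a zero-mean Gaussian \( X \sim \mathcal{N}(0,\sigma^2) \):

\( \phi_X(t) = e^{-\sigma^2 t^2 / 2}, \qquad E[X] = \frac{\phi_X'(0)}{j} = 0, \qquad E[X^2] = \frac{\phi_X''(0)}{j^2} = \frac{-\sigma^2}{-1} = \sigma^2 \)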

Random Process

  • SSS vs WSS:
    • SSS (Strict-Sense Stationary): all joint probability distributions of the process are invariant to time shifts.

    • WSS (Wide-Sense Stationary): the mean of $x_n$ is constant, independent of time, and the autocorrelation depends only on the time difference.

    • SSS constrains the joint distributions of every order, whereas WSS only constrains the first- and second-order statistics. SSS is too strong for most real-world applications, therefore WSS is used in most applications.

  • Zero Mean White Noise
    • Zero mean: $E[\epsilon]=0$

    • $r_{XX}[m] = \sigma^2 \delta(m)$: the autocorrelation function is a delta function, meaning the samples are uncorrelated for $m \neq 0$

    • The power spectrum is flat: $S_X(f) = \sigma^2_X$

  • Non-zero mean white noise:
    • Autocovariance: \( E[(W_n - \mu)(W_{n+m} - \mu)] = \sigma^2 \delta(m) \), which is the autocorrelation function of the zero-mean noise
    • Autocorrelation function: \( R_w(m) = \sigma^2 \delta(m) + \mu^2 \), which is the acf of the zero-mean case plus the mean squared
    • Power spectrum: \( S_w(f) = \sigma^2 + \mu^2 \delta(f) \), which is the power spectrum of the zero-mean case plus a spectral line at zero frequency (can be seen as a DC offset); a simulation sketch follows this list
  • Remember that the mean-square value for a process is just the autocorrelation at $\tau=0$
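
A minimal simulation sketch of the white-noise results above; the mean `mu`, standard deviation `sigma` and the Gaussian samples are illustrative assumptions, not from the notes. The sample autocorrelation should come out close to \( \sigma^2\delta(m) + \mu^2 \): roughly $\sigma^2 + \mu^2$ at lag $m=0$ and $\mu^2$ at the other lags.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.5                       # illustrative mean and standard deviation
N = 200_000
w = mu + sigma * rng.standard_normal(N)    # non-zero-mean white noise

# Sample autocorrelation r_WW[m] = E[W_n W_{n+m}] for a few lags
for m in range(4):
    r_m = np.mean(w[:N - m] * w[m:])
    print(f"r_WW[{m}] ~= {r_m:.3f}")       # expect sigma^2 + mu^2 at m = 0, mu^2 otherwise
```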

Random Process with LTI System

  • Valid for both continuous-time and discrete-time processes

  • Input-output relation for an LTI system driven by a WSS process:

\( y(t) = h(t) \star x(t) = \int h(\beta) x(t-\beta)d\beta \)

This is the continuous-time convolution.

  • PSD relating input and output:

\( S_Y(\omega) = S_X(\omega)|H(\omega)|^2 \)

$H(\omega)$ is the Fourier-domain transfer function (frequency response); a numerical check of this relation is sketched at the end of this section

  • Relating PSD and autocorrelation function:

\( r_{YY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty}S_{Y}(\omega)e^{j\omega\tau} d\omega \)

\( r_{YY}(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty}S_{Y}(\omega) d\omega \)

The autocorrelation function is the inverse Fourier transform of the PSD (the Wiener-Khinchin relation); evaluating it at $\tau=0$ gives the mean-square value (average power)

  • Cross-correlation: \( \begin{align} r_{XY}(t_1,t_2) =& E[X(t_1)Y(t_2)] \newline =& E[X(t_1)\int h(\beta) X(t_2-\beta)d\beta ] \newline =& \int h(\beta)r_{XX}(t_1,t_2-\beta)d\beta \newline =& \int h(\beta)r_{XX}(\tau-\beta)d\beta \newline =& h(\tau)\star r_{XX}(\tau) \end{align} \) where $\tau = t_2 - t_1$ and the fourth line uses wide-sense stationarity: $r_{XX}(t_1,t_2-\beta) = r_{XX}(\tau-\beta)$
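
A minimal numerical check of \( S_Y(\omega) = S_X(\omega)|H(\omega)|^2 \), as referenced above: the input is unit-variance white WSS noise (so \( S_X \) is flat) and `h` is an illustrative 3-tap FIR impulse response; none of these values are from the notes. The averaged periodogram of the filter output is compared with \( \sigma^2|H(\omega)|^2 \).

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 1.0
nseg, seg_len = 256, 512
x = np.sqrt(sigma2) * rng.standard_normal(nseg * seg_len)   # white input, S_X(w) = sigma2

h = np.array([0.5, 1.0, 0.5])              # illustrative FIR impulse response
y = np.convolve(x, h, mode="same")         # y = h * x

# Averaged periodogram estimate of the output PSD S_Y
segs = y.reshape(nseg, seg_len)
S_Y_est = np.mean(np.abs(np.fft.rfft(segs, axis=1))**2, axis=0) / seg_len

# Theory: S_Y(w) = S_X(w) |H(w)|^2 with S_X flat at sigma2
H = np.fft.rfft(h, n=seg_len)
S_Y_theory = sigma2 * np.abs(H)**2

# Maximum deviation across frequency bins; small compared with the PSD peak (~4)
print(np.max(np.abs(S_Y_est - S_Y_theory)))
```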

Estimation

  • Bias of the estimator $\hat{\theta}$: \( E[\hat{\theta}] - \theta \), where $\theta$ is the true parameter value
  • Variance of the estimator $\hat{\theta}$: \( E[(\hat{\theta} - E[\hat{\theta}])^2] \); a simulation sketch follows
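
A minimal sketch of bias and variance in practice, using the biased sample-variance estimator \( \hat{\theta} = \frac{1}{N}\sum_n (x_n - \bar{x})^2 \) of a true variance $\theta$ as an illustrative example; the Gaussian data, `N = 10` and the number of Monte Carlo trials are assumptions for demonstration, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(2)
true_var = 4.0                  # theta: the true parameter being estimated
N, trials = 10, 100_000

# Draw many independent datasets and compute the biased sample variance of each
x = rng.normal(0.0, np.sqrt(true_var), size=(trials, N))
theta_hat = np.mean((x - x.mean(axis=1, keepdims=True))**2, axis=1)

bias = np.mean(theta_hat) - true_var                 # E[theta_hat] - theta, theory: -theta/N
variance = np.mean((theta_hat - np.mean(theta_hat))**2)
print(f"bias ~= {bias:.3f} (theory {-true_var / N:.3f}), variance ~= {variance:.3f}")
```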