
From this previous post, the real and imaginary parts of the Fourier transform of a zero-mean Gaussian process are uncorrelated (and i.i.d. Gaussians).

This somehow seems counterintuitive: it seems that if the real part has a large intensity, then the imaginary part should be small. I read through the proof linked from the previous post, but it didn't give me a good intuitive feel for the result.

Can you provide an intuitive explanation of why this is true?

klurie

1 Answer


Recall the following properties of the Fourier transform:

  • If $x(t)$ is an even function, then its Fourier transform $X(\omega)$ is purely real.

  • If $x(t)$ is an odd function, then its Fourier transform $X(\omega)$ is purely imaginary.
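These two symmetry properties can be verified numerically. A minimal sketch using numpy (not part of the original answer), taking the discrete Fourier transform of one even and one odd sequence:

```python
import numpy as np

# Build an even and an odd signal on a grid centered at n = 0 and check the
# symmetry properties of their DFTs.
n = np.arange(-64, 64)                   # symmetric index grid around zero
even = np.exp(-(n / 10.0) ** 2)          # Gaussian pulse: even in n
odd = n * np.exp(-(n / 10.0) ** 2)       # derivative-like pulse: odd in n

# np.fft.fft assumes the signal starts at index 0, so rotate the origin
# (the sample at n = 0) to the first position with ifftshift first.
E = np.fft.fft(np.fft.ifftshift(even))
O = np.fft.fft(np.fft.ifftshift(odd))

print(np.max(np.abs(E.imag)))   # ~0: even input -> purely real DFT
print(np.max(np.abs(O.real)))   # ~0: odd input -> purely imaginary DFT
```

The `ifftshift` call matters: the DFT's symmetry properties hold relative to index 0, so the time origin has to be placed there before transforming.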

Thus, we can think of the real and imaginary parts of the Fourier transform of a zero-mean Gaussian random process $x(t)$ as the Fourier transforms of two separate inputs $x_e(t)$ and $x_o(t)$: the components of the process that have even and odd symmetry, respectively. We split the process into these components as follows:

$$ x_e(t) = \frac{x(t)+x(-t)}{2} $$

$$ x_o(t) = \frac{x(t)-x(-t)}{2} $$
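As a quick sanity check of this decomposition (a numpy sketch, not from the original answer), one realization of a white zero-mean Gaussian process on a symmetric time grid can be split and reassembled:

```python
import numpy as np

# Split one realization of a zero-mean Gaussian process into its even and odd
# components and confirm the symmetries and that they sum back to the original.
rng = np.random.default_rng(0)
t = np.arange(-100, 101)               # symmetric time grid around t = 0
x = rng.standard_normal(t.size)        # white zero-mean Gaussian samples

x_rev = x[::-1]                        # x(-t) on this symmetric grid
x_e = (x + x_rev) / 2                  # even component
x_o = (x - x_rev) / 2                  # odd component

assert np.allclose(x_e, x_e[::-1])     # even: x_e(t) == x_e(-t)
assert np.allclose(x_o, -x_o[::-1])    # odd:  x_o(t) == -x_o(-t)
assert np.allclose(x, x_e + x_o)       # decomposition is exact
```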

Note that $x(t) = x_e(t) + x_o(t)$. Moving through this step by step,

  • Your observation was that $\text{Re}\{X(\omega)\}$ and $\text{Im}\{X(\omega)\}$ (the real and imaginary parts of the process's Fourier transform) are uncorrelated.

  • Using the Fourier transform properties mentioned above, we can deduce that $\text{Re}\{X(\omega)\} = X_e(\omega)$ and $\text{Im}\{X(\omega)\} = X_o(\omega)$; the real and imaginary parts of $X(\omega)$ are none other than the Fourier transforms of $x_e(t)$ and $x_o(t)$, respectively.

  • Therefore, your observation is equivalent to saying that $X_e(\omega)$ and $X_o(\omega)$ are uncorrelated.

  • Since the Fourier transform is a one-to-one mapping between the time and frequency domains, I posit that the lack of correlation between $X_e(\omega)$ and $X_o(\omega)$ would imply a lack of correlation between $x_e(t)$ and $x_o(t)$ as well.

What is the correlation between $x_e(t)$ and $x_o(t)$? Simple:

$$ \begin{align} \mathbb{E}(x_e(t)x_o(t)) &= \mathbb{E}\left(\left(\frac{x(t)+x(-t)}{2}\right)\left(\frac{x(t)-x(-t)}{2}\right)\right) \\ &= \mathbb{E}\left(\frac{1}{4}\left(x^2(t) - x^2(-t)\right)\right) \\ &= \frac{1}{4} \left(\mathbb{E}(x^2(t)) - \mathbb{E}(x^2(-t))\right) \\ &= \frac{1}{4} \left(\sigma^2 - \sigma^2\right) \\ &= 0 \end{align} $$
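This derivation can be sanity-checked with a Monte Carlo sketch (numpy, not from the original answer). Notice that the derivation needs only equal variances at $t$ and $-t$, not independence, so here $x(t)$ and $x(-t)$ are deliberately drawn *correlated*, yet the expectation still vanishes:

```python
import numpy as np

# Estimate E[x_e(t) x_o(t)] across many realizations at a fixed t, with x(t)
# and x(-t) correlated (rho = 0.6) but of equal variance sigma^2 = 1.
rng = np.random.default_rng(1)
trials, rho = 200_000, 0.6
cov = [[1.0, rho], [rho, 1.0]]                 # equal variances on the diagonal

samples = rng.multivariate_normal([0.0, 0.0], cov, size=trials)
x_t, x_mt = samples[:, 0], samples[:, 1]       # x(t) and x(-t)

x_e = (x_t + x_mt) / 2
x_o = (x_t - x_mt) / 2

print(np.mean(x_e * x_o))   # sample estimate of E[x_e(t) x_o(t)]: near 0
```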

As expected, the even and odd components are uncorrelated. So, to summarize, I would say the following:

  • The real and imaginary components of a Fourier transform correspond to the individual Fourier transforms of the even and odd components of the input function.

  • For a zero-mean Gaussian random process, these even and odd components are uncorrelated.

  • Therefore, their Fourier transforms (the real and imaginary components that you asked about) are also uncorrelated.

Edit: To address your followup:

  • If $x(t)$ is Gaussian, then its even and odd components $x_e(t)$ and $x_o(t)$ are as well, due to the property that any weighted sum of Gaussian random variables is also Gaussian.

  • If $x_e(t)$ and $x_o(t)$ are Gaussian random processes, then their Fourier transforms $X_e(\omega)$ and $X_o(\omega)$ are as well. This follows from the same property as the previous statement; if you look at the transform, you're computing a weighted sum of a bunch of Gaussian random variables.

  • If $X_e(\omega)$ and $X_o(\omega)$ are Gaussian, and they are uncorrelated with one another (as described above), then they are also independent. This is a property of the Gaussian distribution.
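Putting all of this together, the full claim can be illustrated numerically (a numpy sketch, not from the original answer): across many realizations of white zero-mean Gaussian noise, the real and imaginary parts of the DFT at a fixed interior bin are zero-mean, have equal variance, and are uncorrelated.

```python
import numpy as np

# Many realizations of white zero-mean Gaussian noise; inspect one DFT bin
# away from DC and Nyquist, where Re and Im are both nontrivial.
rng = np.random.default_rng(2)
trials, N, k = 50_000, 64, 5          # k: a bin away from DC and Nyquist

X = np.fft.fft(rng.standard_normal((trials, N)), axis=1)
re, im = X[:, k].real, X[:, k].imag

print(np.mean(re), np.mean(im))       # both near 0
print(np.var(re), np.var(im))         # both near N/2 = 32
print(np.corrcoef(re, im)[0, 1])      # near 0: uncorrelated
```

(The DC bin $k = 0$ and the Nyquist bin $k = N/2$ are excluded because they are purely real for real input; everywhere else the real and imaginary parts each collect a weighted sum of the Gaussian samples, as the bullets above describe.)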

Jason R
  • Thanks! As a follow-up, why are they also i.i.d. Gaussian? – klurie Apr 30 '14 at 05:30
  • 1
    I added some more info to the post. – Jason R Apr 30 '14 at 13:39
  • I think the first added bullet is a bit short. The sum of two independent Gaussian variables is Gaussian, but you're perhaps a bit quick in assuming x(-t) and x(t) are independent. (They are, because the autocorrelation function of x is a delta function) – MSalters May 02 '14 at 16:39
  • @MSalters: I don't make that assumption. Check out the part above where I show that $x_e(t)$ and $x_o(t)$ are uncorrelated. I don't assume that $x(t)$ is independent of $x(-t)$ there. The expectation operator is linear, so the order of expectation and subtraction can be swapped. – Jason R May 02 '14 at 16:43
  • @JasonR: The added bullet states that the sum of two Gaussians is Gaussian, therefore x(t) is Gaussian implies xe(t) is Gaussian. That makes sense: xe(t) = x(t) + x(-t), the sum of two Gaussians. But the real theorem is that the sum of two independent Gaussians is Gaussian. Thus to prove xe(t) is Gaussian that way means proving x(-t) is independent. – MSalters May 02 '14 at 16:52
  • @MSalters: I see what you're saying; good point. You are correct in that the independence of $x(t)$ and $x(-t)$ follows straightforwardly from the whiteness of $x(t)$. – Jason R May 02 '14 at 17:24