3

I have read that variance of a deterministic signal is 0 because it is deterministic and does not vary over time.

In this explanation, I am considering a $\sin(t)$ wave where the value at each $t$ is a variable (it would be a random variable if we were making measurements, but I am not sure what to call it here, so just variables). We can have sampled values of $\sin(t)$ for a limited number of variables, but that is not related to my question.

I understand very well that for a single variable its variance is zero. For example, take the value of the sine wave at $t=1$, which is given by $x=\sin(1)$. Since $\sin(1)$ is the same every time you measure it, the variance given by the following formula is zero as well: $$\operatorname{var}(x)= E[(x-\bar x)(x-\bar x)^T]$$ I understand this is zero because $x=\bar x$ every time you evaluate $x$.

Now suppose you have a deterministic signal, $\sin(t)$ from $0$ to $2\pi$. The mean of the signal is $0$, and the "variance" is just the sum of the squared signal values from $0$ to $2\pi$, which is not zero. Even more, this is exactly the energy of the signal. Does that mean the variance of a deterministic signal is defined and does not have to be zero? Is variance even defined for a whole signal, or is this concept valid only for a single random variable?

  • Expectation and average aren’t the same in this context. This is not an ergodic process. $E\{x(t)\}=E\{E\{x(t)\mid t\}\}$, and if $t$ is not random, it just equals $x(t)$. You can calculate an RMS value for $x(t)$, but it isn’t the square root of the variance. The term “mean” is often taken in context; ensemble average and time average are defined differently. – Matt L. Jan 13 '18 at 15:18
  • I guess I am thinking in the wrong way here, but here is what I was thinking: – Pranav Prakash Jan 14 '18 at 13:13
  • Ignore my last comment; sent by mistake. So, I finally get it. We can still assign a probability distribution here, but it will be a joint distribution where probability $1$ is assigned to the vector whose entries are the actual values of $\sin(t)$ at each $t$, and $0$ everywhere else. So the expectation gives $1\cdot x(t)$. Am I thinking right? – Pranav Prakash Jan 14 '18 at 13:22

2 Answers

2

To help you see it better, you could look at the bigger picture $$y(t) = x(t) + \epsilon,$$ where $\epsilon \sim \text{i.i.d.}(0,\sigma^2)$. In this case, we can say that $$E(y(t)) = E(x(t)) + E(\epsilon),$$ where the expectation $E(\cdot)$ is over realizations and not over time; hence $E(x(t)) = x(t)$ and therefore $E(y(t)) = x(t)$. The variance is $$\operatorname{var}(y(t)) = E \big(y(t) - E(y(t))\big)^2 = E(\epsilon^2) = \operatorname{var}(\epsilon) = \sigma^2$$ When $\sigma = 0$, we say that we have a deterministic signal, and hence the variance of $y(t)$ is zero.
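A quick simulation of this model (a sketch assuming Python with NumPy; the values of $\sigma$ and $t_0$, and the Gaussian choice for $\epsilon$, are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 0.3        # noise standard deviation (illustrative)
t0 = 1.0           # a fixed time instant
n = 200_000        # number of realizations

# Realizations of y(t0) = x(t0) + eps, with x(t) = sin(t) and eps ~ N(0, sigma^2)
y = np.sin(t0) + rng.normal(0.0, sigma, n)

print(y.mean())    # ≈ sin(1) ≈ 0.8415: E[y(t0)] = x(t0)
print(y.var())     # ≈ sigma^2 = 0.09: the variance comes from the noise alone
```

Setting `sigma = 0.0` collapses every realization onto the deterministic signal, and the sample variance becomes exactly zero.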

Ahmad Bazzi
1

If $x(t)$ is deterministic, its variance is zero. Note that the mean of $x(t)$ (defined as an expectation) simply equals $x(t)$:

$$\mu_x(t)=E\{x(t)\}=x(t)\tag{1}$$

where the last equality follows from the fact that $x(t)$ is deterministic. From $(1)$ we get the variance

$$E\{[x(t)-\mu_x(t)]^2\}=E\{[x(t)-x(t)]^2\}=0\tag{2}$$

Matt L.
  • It's an interesting question because, in time series econometrics, the textbooks always say that the variance of the time series is equal to $\frac{1}{2 \pi} f(0)$, where $f$ denotes the spectral density. The intuition behind that has always evaded me, because how could squared variability around a mean be equal to a density at a normalized frequency? Thanks for any insights other than what the textbooks say. – mark leeds Jan 13 '18 at 19:01
  • Hi Matt: I was just going to correct that. What I meant was the following: the variance of the series is seen (estimated, whatever you want to say) as $\hat{\sigma}^2 = 2\pi f(0)$. See this link (page 10) for more information: http://www.phdeconomics.sssup.it/documents/Lesson19.pdf – mark leeds Jan 13 '18 at 19:08
  • Matt: Just so my comment is clear. I understand the math behind the derivation of the relation, in that the autocorrelation $R(\tau)$ is the inverse Fourier transform of the spectral density, so you put $\tau = 0$ in and out comes the relation. I'm more lost on how a variance, which to me represents squared deviation around a fixed known mean, can be stated in terms of a density at frequency zero. Thanks. – mark leeds Jan 13 '18 at 19:23
  • @markleeds: Maybe it's a good idea to make a question out of this since I don't think it's closely related to this question, and it's also more useful for other users to have a separate question and answer, not just comments. – Matt L. Jan 13 '18 at 19:25
  • Good idea, Matt. I'll make my question a new question either tonight or tomorrow, and use better notation also because, as you said, the $f(0)$ does represent A) the sum over all the frequencies in the discrete case and B) the integration over the frequency band $[-\pi,\pi]$ in the continuous case. – mark leeds Jan 14 '18 at 04:09
  • I was thinking in terms of how you calculate the variance of scores in a class. If you plot the marks obtained by each student on a graph, it has a variance. Now my doubt was that if you have a deterministic signal which looks like that particular distribution, then even the function will have a variance. However, I think I was confused between random variables and random processes. For the deterministic signal, I think you have to assign a probability distribution to the whole vector $X$, which will be $1$ for that $X$ which corresponds to the actual values of the function. Hence, the expectation is $X$. – Pranav Prakash Jan 14 '18 at 13:41
  • @MattL., i think what the OP is looking for is described in this math.se post: https://math.stackexchange.com/questions/1026366/distribution-of-sine-of-uniform-random-variable-on-0-2-pi – robert bristow-johnson Nov 11 '18 at 07:01
  • i came late to this discussion. i missed it originally. – robert bristow-johnson Nov 11 '18 at 07:02
  • @robertbristow-johnson: Maybe, but I'm not able to read that from the question. Anyway, the OP now has the link if he's still interested. – Matt L. Nov 11 '18 at 17:11
  • variance of a deterministic signal sampled at a totally random time. think about variance and the AC component of power as the same thing. – robert bristow-johnson Nov 12 '18 at 07:35
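The last comment can be checked numerically: sampling the deterministic signal at a uniformly random time $T$ makes $\sin(T)$ a genuine random variable, and its variance matches the AC (average) power of the sinusoid (a sketch assuming Python with NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample the deterministic signal sin(t) at a random time T ~ Uniform(0, 2*pi):
# sin(T) is now a random variable, with the distribution discussed in the
# linked math.SE post.
T = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)
samples = np.sin(T)

print(samples.mean())   # ≈ 0
print(samples.var())    # ≈ 0.5, the AC power of a unit-amplitude sinusoid
```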