I have a temporal signal of the form $f(t) = t\,\eta(t)$, where $\eta(t)$ is white noise with mean $\eta_m$ and standard deviation $\sigma$. I want to calculate the corresponding spectrum $F(\omega)$, take its magnitude $|F(\omega)|$, and finally compute the standard deviation of $|F(\omega)|$ in terms of the given parameters. I have searched the literature but couldn't find anything that directly tackles this problem. Does anyone know how to do this, or can you point me to relevant resources?
The following pictures display $f(t)$ and $|F(\omega)|$.

The plot below is the histogram of $|F(\omega)|$, excluding some values around the central peak.
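For reference, here is a minimal sketch (Python/NumPy) of the numerical experiment behind this kind of histogram. All parameter values, the Gaussian choice for $\eta(t)$, and the cutoff width around the central peak are illustrative assumptions on my part, not the exact settings used for the plots above.

```python
import numpy as np

# Minimal sketch of the numerical experiment (illustrative parameters).
rng = np.random.default_rng(0)
N, dt = 4096, 1e-3             # number of samples and sampling interval (assumed)
eta_m, sigma = 1.0, 0.5        # noise mean and standard deviation (assumed)

t = np.arange(N) * dt
eta = rng.normal(eta_m, sigma, N)   # white (here Gaussian) noise eta(t)
f = t * eta                         # f(t) = t * eta(t)

mag = np.abs(np.fft.fft(f))         # |F(omega)| on the discrete frequency grid

# Exclude bins near the central (DC) peak, as in the histogram above.
keep = np.ones(N, dtype=bool)
keep[:16] = False                   # illustrative cutoff width
keep[-16:] = False
print("empirical STD of |F(omega)|:", mag[keep].std())
```

Histogramming `mag[keep]` (e.g. with `np.histogram`) reproduces the kind of distribution shown above, and the printed value is the empirical standard deviation I would like to predict analytically.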

I want to derive a mathematical analysis that predicts the standard deviation of the above histogram in terms of the given parameters.
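To fix notation, suppose the signal is sampled as $f[n] = t_n\,\eta[n]$ with $t_n = n\,\Delta t$ and the spectrum is a length-$N$ DFT (this discretization is my assumption; the plots may use a different convention):

$$F[k] = \sum_{n=0}^{N-1} t_n\,\eta[n]\,e^{-i 2\pi k n / N}.$$

Since the $\eta[n]$ are independent with variance $\sigma^2$, for $k$ away from $0$ the real and imaginary parts of $F[k]$ should each have variance roughly

$$\operatorname{Var}\big(\operatorname{Re}F[k]\big) \approx \operatorname{Var}\big(\operatorname{Im}F[k]\big) \approx \frac{\sigma^2}{2}\sum_{n=0}^{N-1} t_n^2 \approx \frac{\sigma^2\,\Delta t^2\,N^3}{6}.$$

If, away from the central peak, the deterministic contribution of the mean $\eta_m$ is negligible and the two parts are approximately independent zero-mean Gaussians (assumptions I have not verified), then $|F[k]|$ would be roughly Rayleigh-distributed, with standard deviation $\sqrt{(4-\pi)/2}\,\sigma_F$ where $\sigma_F^2 \approx \sigma^2\,\Delta t^2\,N^3/6$. Is this the right direction?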
"rand(.)..." – who told you that's white? ("white" without qualification?) – robert bristow-johnson Dec 02 '15 at 21:20

It is $t\eta(t)$ with $\eta(t)$ a randomly distributed noise of non-zero mean. – nougako Dec 02 '15 at 21:28