6

Suppose $(x_i)_{i\in\mathbb{N}}$ is a sequence of strictly positive real numbers such that $L=\sum_{i\in\mathbb{N}}x_i$ is finite.

Suppose that $(X_i)_{i\in\mathbb{N}}$ is a sequence of independent (real-valued) random variables, each uniformly distributed on $[-x_i, x_i]$.

I am interested in the behaviour of $S_n=\sum_{i=0}^n X_i$ as $n\rightarrow\infty$.

$S_n$ clearly lies in $[-L, L]$, so I guess the central limit theorem cannot apply here ($\sum_{i\in\mathbb{N}}x_i^2$ is finite), and $S_n$ does not converge to $\mathcal{N}(0,?)$, since the probability density function of $S_n$ has bounded support ($[-L, L]$).

Can someone help me find the distribution of $S_n$ as $n\rightarrow\infty$? (Is it something like $\mathcal{N}(0,?)$ restricted to $[-L, L]$?)
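
(For concreteness, here is a minimal simulation sketch; the choice $x_i = 2^{-i}$ is just an example, not part of the question.)

```python
import numpy as np

# Example sequence (illustration only): x_i = 2^{-i}, so L = sum(x_i) < 1.
x = 0.5 ** np.arange(1, 31)

# Many realisations of S_n = sum_{i<=n} X_i with X_i uniform on [-x_i, x_i].
rng = np.random.default_rng(0)
samples = (rng.uniform(-1.0, 1.0, size=(100_000, len(x))) * x).sum(axis=1)

# A histogram of `samples` approximates the density of S_n for n = len(x):
# it looks bell-shaped, but its support is contained in [-L, L].
```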

Thanks

  • 2
    Imagine $x_1 = 1$ and, say, $x_j = e^{-100j}$ for $j > 1$. Then $S_n$ is basically a uniform $[-1,1]$ variable plus the tiniest bit of additional noise. – usul Sep 11 '17 at 16:33
  • 2
    I suppose the answer is that it's the limit of the closed form expressions in this paper: https://link.springer.com/article/10.1007/s00362-007-0049-4 – Alexander Pruss Sep 11 '17 at 18:07
  • @AlexanderPruss: I was not aware of this paper, but of some others pointing to the same kind of result. But I still don't know what to do with it. The probability density function is a sum of $2^n$ polynomials (of degree at least $n$), and I can compute it up to $n=25$, but no further... With simulations, the pdf "looks like" a truncated Gaussian curve, but I have no idea whether it really is one (as the CLT would guarantee when it applies) or not – ThibThib Sep 13 '17 at 12:37
  • What are you doing with the distribution? For numerical purposes, the CLT idea may be good enough: taking $x_i = \tfrac12$ for $i \le 12$ (and negligible afterwards) is a standard numerical approximation to the normal. –  Sep 15 '17 at 12:22

3 Answers

4

The saddle-point method can give a large-$n$ approximation of the distribution of $S_n$. This has been worked out by Murakami in A saddlepoint approximation to the distribution of the sum of independent non-identically uniform random variables (2014). The leading order expression (with $1/n$ corrections) is given in section 2 of the paper.

To find the distribution $p(v)$ for the sum you need to solve the saddle-point equation $\kappa'_n(s)=v$ for $s$, with $$\kappa_n(s)=-n\log s-\sum_{i=1}^n\log a_i+\sum_{i=1}^n\log\left[e^{a_i s}-1\right].$$ The random variables are taken uniformly distributed in $(0,a_i)$, $i=1,2,\ldots,n$. This equation can be solved numerically by the Newton–Raphson algorithm. The desired distribution then follows from $$p(v)=[2\pi\kappa''_n(s)]^{-1/2}\exp[\kappa_n(s)-sv],$$ upon insertion of the saddle-point value of $s$.
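
A minimal numerical sketch of this procedure (the function name, tolerances, Newton starting point and the example sequence $a_i=2^{1-i}$ are placeholder choices, not taken from Murakami's paper):

```python
import numpy as np

def saddlepoint_density(v, a, tol=1e-10, max_iter=200):
    """Saddle-point approximation to the density at v of sum_i U_i,
    U_i uniform on (0, a_i), using the formulas quoted above.
    Assumes 0 < v < sum(a) and v not exactly equal to the mean sum(a)/2."""
    a = np.asarray(a, dtype=float)
    n = len(a)

    def kappa(s):   # kappa_n(s) = sum_i log[(e^{a_i s} - 1) / (a_i s)]
        return np.sum(np.log(np.expm1(a * s) / (a * s)))

    def kappa1(s):  # kappa_n'(s)
        return -n / s + np.sum(a / (-np.expm1(-a * s)))

    def kappa2(s):  # kappa_n''(s), always positive
        return n / s**2 - np.sum((a / (2.0 * np.sinh(a * s / 2.0))) ** 2)

    # Newton-Raphson for kappa_n'(s) = v; the sign of s matches v - mean.
    s = np.sign(v - a.sum() / 2.0) * 1e-2
    for _ in range(max_iter):
        step = (kappa1(s) - v) / kappa2(s)
        s -= step
        if abs(step) < tol:
            break

    return np.exp(kappa(s) - s * v) / np.sqrt(2.0 * np.pi * kappa2(s))

# Example: a_i = 2 x_i with x_i = 2^{-i}; the question's S at v corresponds
# to the shifted sum evaluated at v + sum(x_i) = v + a.sum()/2.
a = 2.0 * 0.5 ** np.arange(1, 26)
print(saddlepoint_density(0.3 + a.sum() / 2.0, a))
```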

Carlo Beenakker
  • 177,695
2

By the Kolmogorov three-series theorem (or one of its corollaries), it is known that $S_n$ converges almost surely and in $L^2$ to a random variable $Y$ whose characteristic function can be computed explicitly.

$$\phi_{S_n}(t) = \prod_{j=1}^n \phi_{X_j}(t) = \prod_{j=1}^n {\sin(t x_j)\over t x_j}$$

Hence

$$\phi_Y(t) = \prod_{j=1}^\infty {\sin(t x_j)\over t x_j}$$

From the characteristic function of $Y$, one may recover e.g. the distribution function of $Y$ using the Fourier inversion formula. I don't think a more explicit form for $\phi_Y$ can be obtained without further information on the sequence $(x_j)$.
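
For instance, a rough numerical inversion, truncating the product to finitely many factors (the integration grid, the cutoff and the example sequence $x_j = 2^{-j}$ below are arbitrary choices), could be sketched as:

```python
import numpy as np

def limit_density(v, x, t_max=200.0, n_t=20001):
    """Approximate the density of Y at v by numerically inverting
    phi_Y(t) = prod_j sin(t x_j)/(t x_j), truncated to the given x_j.
    phi_Y is real and even, so p(v) = (1/pi) * int_0^inf phi_Y(t) cos(t v) dt."""
    x = np.asarray(x, dtype=float)
    t = np.linspace(1e-9, t_max, n_t)   # avoid the removable singularity at t = 0
    phi = np.prod(np.sin(np.outer(t, x)) / np.outer(t, x), axis=1)
    dt = t[1] - t[0]
    return np.sum(phi * np.cos(t * v)) * dt / np.pi

# Example: x_j = 2^{-j}, so L = 1; compare with a histogram of simulated sums.
x = 0.5 ** np.arange(1, 26)
print([round(limit_density(v, x), 4) for v in (-0.5, 0.0, 0.5)])
```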

coudy
  • 18,537
2

There is no central limit theorem here. But never mind, this is a wonderful example of a $C^\infty$ function that is nowhere analytic!

https://en.wikipedia.org/wiki/Fabius_function

See also: What are your favorite instructional counterexamples?

RaphaelB4
  • 4,321