
Let $f,g$ be convex functions on $[0,\infty)$ such that $\lim_{x\to\infty}\frac{f(x)}{g(x)} = 1$ and $\lim_{x\to +\infty} g(x) = +\infty$.

Is it always true that $\lim_{x\to \infty} \frac{f(x+1)-f(x)}{g(x+1)-g(x)} = 1$?

I can prove it when $g(x)=x$ and $g(x) = x^2$.

Edit. It is actually not so hard to show it works more generally when $g(x) = x^\alpha$ for any $\alpha \geq 1$.
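As a quick numerical illustration (not a proof) of the difference-quotient claim, one can take a concrete pair of my own choosing: $f(x) = x^2 + 3x + 5$ (convex, with $f(x)/g(x) \to 1$) and $g(x) = x^2$, and watch $\frac{f(x+1)-f(x)}{g(x+1)-g(x)} = \frac{2x+4}{2x+1} \to 1$:

```python
# Sanity check for g(x) = x^2 with a hypothetical convex f satisfying f/g -> 1.
def f(x):
    return x * x + 3 * x + 5   # convex on [0, inf), f(x)/x^2 -> 1

def g(x):
    return x * x

def diff_quot(x):
    # (f(x+1) - f(x)) / (g(x+1) - g(x)) = (2x + 4) / (2x + 1)
    return (f(x + 1) - f(x)) / (g(x + 1) - g(x))

q = diff_quot(10**6)   # close to 1 for large x
```

The deviation from $1$ is exactly $\frac{3}{2x+1}$, so it vanishes as $x \to \infty$, consistent with the claim for $g(x) = x^\alpha$.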

The question can also be asked with series as a partial converse to the Stolz-Cesaro Theorem:

Let $(a_n)$ and $(b_n)$ be increasing sequences such that $\displaystyle\lim_{n\to\infty} \sum_{k=1}^n b_k = +\infty$.

Does $\displaystyle\lim_{n\to\infty} \dfrac{\sum_{k=1}^n a_k}{\sum_{k=1}^n b_k} = 1$ imply $\displaystyle\lim_{n\to\infty} \dfrac{a_n}{b_n} = 1$?

Rhubarbe
    If, for example, $g(x+1)/g(x)$ tends to a finite limit $a \neq 1$, then the statement is true. Since $g(x)$ tends to infinity and is convex, it is monotonic from some point $x^*$ on. Thus $$ \frac{f(x + 1) - f(x)}{g(x + 1) - g(x)} = \frac{\frac{f(x + 1)}{g(x + 1)} - \frac{f(x)}{g(x)}\frac{g(x)}{g(x + 1)}}{1 - \frac{g(x)}{g(x + 1)}} \to \frac{1 - 1/a}{1 - 1/a} = 1, $$ since $\frac{g(x)}{g(x+1)} \to \frac{1}{a}$. Note that the fraction on the left is well-defined for $x>x^*$. – Gary Jan 23 '20 at 11:15
  • That is a good point, but I'm actually mostly interested by $g(x) = x^\alpha$, in which case $\lim_{x\to\infty} \frac{g(x+1)}{g(x)} = 1$. – Rhubarbe Jan 23 '20 at 11:54
  • I have an idea but not sure whether it will work or not. We are fixing $g(x)=x^{\alpha}, \alpha\in \mathbb{N}.$ First prove it when $f$ is a polynomial (this can be done using the mean value theorem). Next, try to approximate $f$ uniformly by a sequence of polynomials $\{P_n\}$. Since uniform convergence gives $\|f/P_n -1\|_\infty \to 0,$ I think it could be done. – Aditya Ghosh Jan 23 '20 at 17:48
  • @Rhubarbe could you elaborate on how you proved the $g(x)=x^\alpha$ case? – Hotdog2000 Dec 08 '20 at 14:10

1 Answer


I don't think so.

Counterexample for the $a_k$, $b_k$ case: $$ a_k = 2^{\lfloor \log_2(k) \rfloor} \quad\quad \text{and} \quad\quad b_k = 2^{\lfloor \log_2(k+1) \rfloor} $$

So $a_k = b_k$ for most $k$; the exception is $k=2^n-1$, where $a_k=2^{n-1}$ and $b_k=2^n$.

Clearly $a_{k+1}\geq a_k$ and $b_{k+1}\geq b_k$, while $\lim_{k\to\infty}\frac{a_k}{b_k}$ does not exist: the ratio equals $1$ except at $k=2^n-1$, where it equals $\frac{1}{2}$.

On the other hand, it can be shown that
$$ b_k\geq a_k\geq \frac{k}{2}, \text{ so } \sum_{k=1}^N b_k \geq \sum_{k=1}^N a_k \geq \frac{N(N+1)}{4}, $$ while the partial sums differ only at the indices $k=2^n-1\leq N$, so $$ \sum_{k=1}^N b_k - \sum_{k=1}^N a_k \leq N+\frac{N}{2} + \frac{N}{4} + \cdots \leq 2N. $$

Thus $$ \lim_{N\to\infty} \frac{\sum_{k=1}^N b_k - \sum_{k=1}^N a_k}{\sum_{k=1}^N b_k} = 0, $$ since the numerator is at most $2N$ while the denominator is at least $\frac{N(N+1)}{4}$.

In summary, in this example $a_k$ and $b_k$ are increasing sequences with $\displaystyle\lim_{N\to\infty} \sum_{k=1}^N a_k = +\infty$ and $\displaystyle\lim_{N\to\infty} \sum_{k=1}^N b_k = +\infty$.

We also have $$ \lim_{N\to\infty} \frac{\sum_{k=1}^N a_k}{\sum_{k=1}^N b_k} =1 $$

but $\lim_{k\to\infty}\frac{a_k}{b_k}$ does not exist.
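As a quick numerical check (not part of the proof), the following sketch computes both ratios for this counterexample, using the identity $2^{\lfloor \log_2 k \rfloor} = 2^{\,k.\mathrm{bit\_length}() - 1}$ for integers $k \geq 1$:

```python
# a_k = 2^floor(log2 k),  b_k = 2^floor(log2(k+1))  (the answer's counterexample)
def a(k):
    return 2 ** (k.bit_length() - 1)        # = 2^floor(log2 k) for k >= 1

def b(k):
    return 2 ** ((k + 1).bit_length() - 1)  # = 2^floor(log2(k+1))

# The ratio of partial sums tends to 1 ...
N = 10**6
sa = sum(a(k) for k in range(1, N + 1))
sb = sum(b(k) for k in range(1, N + 1))
ratio = sa / sb                             # just below 1 for large N

# ... but a_k / b_k oscillates: it is 1 except at k = 2^n - 1, where it is 1/2.
vals = {a(k) / b(k) for k in [100, 2**10 - 1, 1000, 2**15 - 1]}
```

Here `ratio` is within a few parts per million of $1$ at $N = 10^6$, while `vals` contains both $1$ and $\frac{1}{2}$, matching the two limits discussed above.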

MoonKnight