
My question is the following: Let $f\in C^\infty(a,b)$, such that $f^{(n)}(x)\ne 0$, for every $n\in\mathbb N$, and every $x\in (a,b)$. Does that imply that $f$ is real analytic?

EDIT. According to a theorem of Serge N. Bernstein (Sur les fonctions absolument monotones, Acta Mathematica 52 (1928), pp. 1–66), if $f\in C^\infty(a,b)$ and $f^{(n)}(x)\ge 0$ for all $n$ and all $x\in(a,b)$, then $f$ extends analytically to the disc in $\mathbb C$ centered at $a$ with radius $b-a$!
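A standard example showing this radius is sharp: $f(x)=\frac{1}{1-x}$ satisfies $f^{(n)}(x)=\frac{n!}{(1-x)^{n+1}}\ge 0$ on $(-1,1)$, and its Taylor series at $a=-1$ has radius of convergence exactly $b-a=2$: the disc $|z+1|<2$ reaches the singularity at $z=1$.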

smyrlis

  • Why do you think it would be? Do you have a reason? – András Bátkai Dec 25 '13 at 23:48
  • @AndrásBátkai: All the examples of $C^\infty$ functions I know of, or can construct, have a lot of zeros in their derivatives. – smyrlis Dec 25 '13 at 23:57
  • And if you add a suitable analytic function to your example? – András Bátkai Dec 26 '13 at 00:03
  • @AndrásBátkai: The size of the derivatives in all these examples is huge; for example, $f(x)=\mathrm{e}^{-1/x^2}$ for $x>0$ and zero for $x\le 0$. Its derivatives become extremely unpleasant near zero. – smyrlis Dec 26 '13 at 00:06
  • I think $e^{-1/x^2}$ has perfectly reasonable derivatives near $0$. If you add $e^x$, you even get something where the derivatives are far from $0$. – Anthony Quas Dec 26 '13 at 01:15
  • I don't understand the votes to close. – Pete L. Clark Dec 26 '13 at 04:12
  • @AnthonyQuas: Nope, if we take $f(x) = e^{-1/x^2} + e^x$, then $f'$ has a zero near $x \approx -0.59118$. – Nate Eldredge Dec 26 '13 at 06:32 (see the numerical sketch after these comments)
  • The pattern might not be obvious from Nate Eldredge's comment, but higher derivatives of $\exp(-1/x^2)$ have increasingly large oscillations increasingly close to $0$. Adding something with small derivatives doesn't change this property. – Douglas Zare Dec 26 '13 at 09:18
  • @Nate: (Not so important, now that the question has been changed.) The smyrlis-Quas example is: $\exp(-1/x^2)+\exp(x)$ for $x \gt 0$ and $\exp(x)$ for $x \le 0$. – Gerald Edgar Dec 26 '13 at 14:42
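A small numerical sketch of the points raised above (my own illustration using Python with mpmath, not part of the original thread): it checks Nate Eldredge's root of $f'$ and brackets sign changes of a higher derivative of $\exp(-1/x^2)$.

```python
# Illustrative check of the comments above (not from the thread).
# Nate Eldredge's root refers to the two-sided function
# f(x) = exp(-1/x^2) + exp(x); Gerald Edgar's one-sided variant has no such zero.
from mpmath import mp, exp, diff, findroot

mp.dps = 50  # high working precision for the numerical derivatives

f = lambda x: exp(-1/x**2) + exp(x)

# f' should vanish near x = -0.59118 (Nate Eldredge's comment)
root = findroot(lambda x: diff(f, x), -0.6)
print(root)  # approximately -0.59118

# Douglas Zare's point: high derivatives of exp(-1/x^2) change sign
# repeatedly as x -> 0.  Bracket sign changes of the 8th derivative.
g = lambda x: exp(-1/x**2)
xs = [mp.mpf(k) / 20 for k in range(5, 31)]   # scan x in [0.25, 1.5]
vals = [diff(g, x, 8) for x in xs]
brackets = [(xs[i], xs[i + 1]) for i in range(len(xs) - 1)
            if vals[i] * vals[i + 1] < 0]
print(brackets)  # intervals on which the 8th derivative changes sign
```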

2 Answers


If $f$ is $C^{\infty}$, every derivative is continuous; since each $f^{(n)}$ is also nonvanishing on the interval, the intermediate value theorem shows that each $f^{(n)}$ has constant sign. Such functions were studied by S. Bernstein, who called them regularly monotonic. In particular, he proved in 1926 that a regularly monotonic function is real analytic.

A 1971 AMM article by R.P. Boas (Signs of derivatives and analytic behavior, Amer. Math. Monthly 78 (1971)) provides a proof, more history, and further results along these lines. See also a 1975 PAMS article of J. McHugh (A proof of Bernstein's theorem on regularly monotonic functions, Proc. Amer. Math. Soc. 47 (1975)).

Pete L. Clark

  • A very readable proof of Bernstein's Theorem appears on p. 437 of Volume I (second edition) of Apostol's Calculus. It also seems to be one of the rare occasions when one needs the integral form of the remainder in Taylor's Theorem. – Ted Shifrin Dec 26 '13 at 20:13
  • I have always preferred the integral formula for the remainder term, because it's an explicit formula. – Deane Yang Dec 26 '13 at 20:19
  • @Ted: Thanks. I was thinking of putting some material on analytic functions into my Honors Calculus notes, but a lot of the basic facts have rather unrewardingly technical proofs. However, this is a beautiful theorem. I'll look to see what Apostol does. – Pete L. Clark Dec 26 '13 at 20:58
  • @Deane: Sure. But rarely does one actually need it. It's pointless if one just uses the usual $L^\infty$ bound inside the integral. – Ted Shifrin Dec 26 '13 at 21:02
  • Deane and Ted: The integral form of the remainder is also used to prove the analyticity of Newton's function $(1+x)^{\alpha}$. Come to think of it, this may not be a coincidence... – Pete L. Clark Dec 26 '13 at 21:11
  • I like the integral form, because it can be proved using the Fundamental Theorem of Calculus (which is always taught) instead of the mean value theorem (which is not always taught). – Deane Yang Dec 26 '13 at 21:19
  • Also, I once used the integral form in the symbol calculus of pseudodifferential operators, because I needed an exact formula for the error term to get the precise estimates I was after. – Deane Yang Dec 26 '13 at 21:20
  • Don't get me wrong. I'm glad to see the need for it. :) @Pete, yes, I knew that it was needed for the generalized binomial theorem. Mike Spivak and I batted that one around when I convinced him to de-emphasize the integral form in the latest edition of his book. Waits for missiles to be hurled. – Ted Shifrin Dec 26 '13 at 22:36
  • @Ted: Spivak's Calculus is one of my favorite math texts. Differences between the editions seem minor to me, especially because I own several different editions. (Sorry if that disappoints you...) More sincerely: thank you for your involvement in keeping in print the best calculus text the world has ever seen. – Pete L. Clark Dec 26 '13 at 23:12
  • @PeteL.Clark: I think that the most elementary proof of this theorem appears in the book of W.F. Donoghue, Distributions and Fourier Transforms (1969). In the same book, one can find a lot of unusual theorems, which we never teach! – smyrlis Dec 28 '13 at 01:05
  • The link to the 1971 AMM article by R.P. Boas is broken. Could you please provide the full reference, including the title? Thank you. – Hans Apr 26 '20 at 16:59

Yes, any such function is analytic.

Assume the contrary and let $f$ be such a function. Note that if $f$ is analytic at two points, then it has to be analytic everywhere between them (this uses the hypothesis on $f$; see the comments below). So, by passing to a subinterval, we may assume that $f$ is not analytic on any subinterval.

We can assume that $0$ is a point in the interval.

Assume the Taylor series of $f$ at $0$ converges in the $\varepsilon$-neighborhood of $0$, and denote by $\bar f$ its sum. The monotonicity of $f^{(n)}$ gives a bound on the error $f(x)-\bar f(x)$ on one side of $0$; it follows that $\bar f(x)=f(x)$ either for $0<x<\varepsilon$ or for $-\varepsilon<x<0$, so $f$ is analytic on a subinterval, a contradiction.
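One standard route to such a one-sided bound (a sketch, not necessarily the argument intended here) is the integral form of the Taylor remainder: writing $T_n$ for the degree-$n$ Taylor polynomial of $f$ at $0$,
$$f(x)-T_n(x)=\frac{1}{n!}\int_0^x (x-t)^n f^{(n+1)}(t)\,dt,$$
and for $0<x<\varepsilon$ the factor $(x-t)^n$ is nonnegative on $(0,x)$, so the remainder carries the constant sign of $f^{(n+1)}$ there.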

It remains to consider the case when the Taylor series of $f$ at $0$ diverges in every neighborhood of $0$. In this case, for any $\varepsilon>0$ there are arbitrarily large $n$ such that $|f^{(n)}(0)|>\tfrac{n!}{\varepsilon^n}$. Applying the monotonicity of $f^{(n)}$ and integrating, we get that $|f^{(k)}(x)|>2^n$ for every $k\le n$ at some point $x$ with $-4\varepsilon<x<4\varepsilon$; taking $k=0$, this contradicts the boundedness of $f$ on $[-4\varepsilon,4\varepsilon]$.
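To spell out the integration step (a sketch; the relevant side of $0$ may change from one derivative to the next, which is why the conclusion only locates $x$ somewhere in $(-4\varepsilon,4\varepsilon)$): if $|f^{(n)}|\ge c$ near $0$, then, since $f^{(n-1)}$ has constant sign and is monotone, it moves away from $0$ on at least one side, so that
$$|f^{(n-1)}(x)|\ \ge\ c\,|x|$$
on that side. Iterating gives bounds of the form $|f^{(k)}(x)|\ge \frac{c\,|x|^{n-k}}{(n-k)!}$ at suitable points; with $c=n!/\varepsilon^n$ and $|x|$ of size about $3\varepsilon$ this is roughly $\frac{n!}{(n-k)!}\,3^{n-k}\,\varepsilon^{-k}\ \ge\ k!\,3^{n-k}\,\varepsilon^{-k}$, which exceeds $2^n$ for every $k\le n$ once $n$ is large (assuming $\varepsilon\le 1$).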

Andrés E. Caicedo

  • You said: "Note that if it is analytic at two points then it has to be analytic everywhere between." Let $f(x)=\exp\left(\frac{1}{x^2-1}\right)$ for $|x|<1$ and $f(x)=0$ for $|x|\ge 1$. This $f$ is analytic on $\mathbb R\smallsetminus\{-1,1\}$, but not at $x=\pm 1$. So, for example, it is analytic around $x=0$ and around $x=2$, but not on the whole interval $(0,2)$. – smyrlis Dec 26 '13 at 08:47
  • @smyrlis, you have to use that the function is the one from the question; namely, all its derivatives are monotonic. – Anton Petrunin Dec 26 '13 at 16:06
  • I cannot understand your comment. – smyrlis Dec 26 '13 at 16:13
  • @smyrlis, at the endpoints the Taylor series converges to $f$. Since the coefficients in the Taylor series are monotonic, the Taylor series at any point in between also converges... – Anton Petrunin Dec 26 '13 at 16:33
  • That's true - not so obvious though. – smyrlis Dec 26 '13 at 16:41