
How do I solve $\lim_{x \to \infty} \left(\frac{1}{x}\right)^{\frac{1}{x}}$ without appealing to L'Hôpital?

Note: If I take natural logs of both sides, I eventually must invoke L'Hôpital.

The best idea I've seen so far is the Squeeze Theorem, but I have been unable to come up with functions that squeeze $\left(\frac{1}{x}\right)^{\frac{1}{x}}$.

2 Answers


I would suggest a change of variables: let $y=\frac{1}{x}$,

and so switch $\lim_{x \to \infty} \left(\frac{1}{x}\right)^{\frac{1}{x}}$ to $\lim_{y \to 0^+} y^y$.

Notice that $$\lim_{y \to 0^+} y^y = \lim_{y \to 0^+} e^{y \ln y}.$$

Since $y \ln y \to 0$ as $y \to 0^+$ and $e^x$ is continuous at $0$, the limit is $e^0 = 1$.

No L'Hopital :)
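As a quick numerical sanity check of the substitution (a sketch in plain Python, not a proof):

```python
# Numerical sanity check (not a proof): y**y should approach 1 as y -> 0+,
# which is the same as (1/x)**(1/x) -> 1 as x -> infinity.
import math

for y in [0.1, 0.01, 0.001, 1e-6]:
    # y**y = exp(y * ln y); y * ln y -> 0, so y**y -> e**0 = 1
    print(y, y ** y, y * math.log(y))
```

Running this shows $y \ln y$ shrinking toward $0$ and $y^y$ approaching $1$ from below.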

Oria Gruber
  • But how can you explain that $x\log x\xrightarrow[x\to 0^+]{}0$ without l'Hôpital? – Timbuc Sep 17 '14 at 22:08
  • There's gotta be a line where you say "Ok, I know it so I can use it" without having to prove it again :) – Oria Gruber Sep 17 '14 at 22:10
  • http://math.stackexchange.com/questions/522973/lim-x-to0-x-ln-x-without-lhopitals-rule – Poppy Sep 17 '14 at 22:15
  • @Timbuc: To show that $x\log x \to 0$ as $x \to 0^{+}$ you need to use properties of the log function. Any such property would require a proper definition of logarithm. You can put $y = 1/x$ and then $x\log x = -(\log y)/y$ and $y \to \infty$. Then use the fact that $\log y < 2(\sqrt{y} - 1) < 2\sqrt{y}$ so that $$0 < \frac{\log y}{y} < \frac{2}{\sqrt{y}}$$ for $y > 1$ and by squeeze theorem you get $\lim_{y \to \infty}(\log y)/y = 0$ and hence $x\log x \to 0$ as $x \to 0^{+}$. – Paramanand Singh Sep 18 '14 at 01:30
  • @Timbuc: Based on the upvotes of your comment, I don't see why people are so used to believing that "you can't explain $x\log x \to 0$ as $x \to 0^{+}$ without L'Hospital". Almost any first order limit is possible without L'Hospital and Taylor and many seemingly higher order limits can be reduced to an expression consisting of first order limits. So that in most usual limit problems LHR or Taylor is unnecessary. – Paramanand Singh Sep 18 '14 at 01:36
  • I didn't say it is impossible to show that limit without LHR, @ParamanandSingh... but the proof you give is very convoluted and I can't see where you got that inequality from: $$\log y< 2(\sqrt y-1)\;?$$ – Timbuc Sep 18 '14 at 02:38
  • @Timbuc: there are many definitions of $\log x$ and one definition is $\lim_{n \to \infty}n(x^{1/n} - 1) = \log x$. When we work with this definition then we show that for $x > 1$ the sequence $n(x^{1/n} - 1)$ decreases when $n$ increases and the limit as $n \to \infty$ is defined as $\log x$. Since the sequence is decreasing it follows that $\log x < n(x^{1/n} - 1)$ for all positive integers $n $ and $x > 1$. Putting $n = 2$ we get the desired result. – Paramanand Singh Sep 18 '14 at 04:07
  • @Timbuc: If we go by definition $$\log x = \int_{1}^{x}\dfrac{dt}{t}$$ then it is much easier to handle. Clearly when $x > 1$ and $1 < t < x$ then $1/t < 1/\sqrt{t}$ and hence $$\log x = \int_{1}^{x}\frac{dt}{t} < \int_{1}^{x}\frac{dt}{\sqrt{t}} = 2(\sqrt{x} - 1)$$ So in both cases the inequality for $\log x$ follows directly from the definitions and is not convoluted at all. Or perhaps you are used to some other definition. – Paramanand Singh Sep 18 '14 at 04:10
  • @ParamanandSingh, in the first definition, how would you prove that $n(x^{1/n}-1)$ is monotone descending and bounded below? The integral definition indeed makes things easier as far as concerns that inequality, yet I still think the whole thing is pretty convoluted, but the point is proved: no LHR or Taylor involved, though integrals are usually studied after these two. – Timbuc Sep 18 '14 at 09:17
  • @Timbuc: The rigorous definitions of $\log x$ are not easy to handle, especially for a beginner in calculus. Using LHR makes things simpler but it does not help with rigorous definitions of $\log x$. BTW if $x > 1$ then $x^{1/n} > 1$ so that $n(x^{1/n} - 1) > 0$ and it is bounded below. The decreasing nature is a bit tricky with algebra (see "Logarithm as limit" in http://paramanands.blogspot.com/2014/05/theories-of-exponential-and-logarithmic-functions-part-2_10.html). Otherwise integrate the inequality $$t^{a - 1} < t^{b - 1}$$ for $0 < a < b$ over $[1, x]$ and use $a = 1/(n + 1), b = 1/n$. – Paramanand Singh Sep 18 '14 at 13:27
  • @Timbuc: While working some problem on MSE I found a very easy proof of $\log x < n(x^{1/n} - 1)$ for all $x > 1$ and positive integers $n$. Note that for $n = 1$ the result $\log x < x - 1$ is almost obvious via any definition of $\log x$ (this is equivalent to $e^{y} > 1 + y$ for all $y > 0$). Let us then suppose that $x^{1/n} = t$ so that $t > 1$. And then $$\log x = \log (t^{n}) = n\log t < n(t - 1) = n(x^{1/n} - 1)$$ so that the result for all positive integers $n$ follows from a simple application of the same result for $n = 1$. – Paramanand Singh Oct 26 '14 at 09:38
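The inequality driving the squeeze argument in this thread can be spot-checked numerically (a sketch, not a proof; Python standard library only):

```python
# Spot-check (not a proof) of the inequality used in the comments above:
# for y > 1, log y < 2*(sqrt(y) - 1), and hence 0 < (log y)/y < 2/sqrt(y) -> 0.
import math

for y in [2.0, 10.0, 1e3, 1e6]:
    assert math.log(y) < 2 * (math.sqrt(y) - 1)
    assert 0 < math.log(y) / y < 2 / math.sqrt(y)
```

The upper bound $2/\sqrt{y}$ visibly tends to $0$, which is what forces $(\log y)/y \to 0$.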

Firstly, note that this limit being $1$ is equivalent to having $x^{1/x} \rightarrow 1$ as $x\rightarrow \infty$, since $\left(\frac{1}{x}\right)^{1/x} = 1/x^{1/x}$.

Try to imitate the proof that $(n^{1/n}) \rightarrow 1$ (as a discrete limit on the naturals). This goes as follows: it can be shown by the Mean Value Theorem that whenever $p\geq 1$ and $y\geq 0$, $(1+y)^p \geq 1 + py$. Now define $a_n = n^{1/n} - 1$, so that $n = (1 + a_n)^n$. Observe that for $n\geq 2$, $$\begin{align}\sqrt n &= (1 + a_n)^{n/2} \\ &\geq 1 + \frac{n}{2}a_n. \end{align}$$ Hence $$0 \leq a_n \leq 2\,\frac{\sqrt n - 1}{n},$$ so $(a_n) \rightarrow 0$ and the result follows.

What changes need to be made to this proof to show it for a continuous limit as $x\rightarrow \infty$?
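The bound derived above can be spot-checked numerically before adapting it (a sketch, not a proof):

```python
# Spot-check (not a proof) of the bound in the answer above:
# with a_n = n**(1/n) - 1, we expect 0 <= a_n <= 2*(sqrt(n) - 1)/n for n >= 2,
# and the upper bound tends to 0, squeezing a_n -> 0.
import math

for n in [2, 10, 100, 10 ** 6]:
    a_n = n ** (1.0 / n) - 1
    assert 0 <= a_n <= 2 * (math.sqrt(n) - 1) / n
```

For the continuous version one would replace $n$ by real $x \geq 2$; the same Bernoulli-style inequality applies since $p = x/2 \geq 1$.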

Matt Rigby