
How would one prove that a function is a polynomial? I can't seem to find anything about this on the internet. I would like to know if there are any unique properties that only polynomials can satisfy. Given properties of a function, can these properties be used to prove that the function is a polynomial? I would like to do this without actually constructing the polynomial. What I mean is something like this: one can show that the sine function is not a polynomial without deriving an actual formula; you just use the property that a polynomial has finite degree.

EDIT: My function is only defined on the positive integers. So I am not sure how to use the ideas of complex analysis.

Halbort
  • A minor extension of Liouville's theorem: If $f: \mathbb{C} \to \mathbb{C}$ is holomorphic, and $|f(z)| = O(|z|^N)$ as $|z|\to \infty$, then $f$ is a polynomial of degree $\leq N$. Is this the sort of thing you are looking for? – David E Speyer Jun 30 '15 at 21:08
  • Yes exactly David Speyer thank you: +1 – Halbort Jun 30 '15 at 21:09
  • If you differentiate it finitely many times and get 0 everywhere, maybe? – Anthony Quas Jun 30 '15 at 21:10
  • A polynomial of degree $n$ is uniquely determined from its values at $n+1$ distinct points. – Geoff Robinson Jun 30 '15 at 21:10
  • @Geoff Robinson How would I use that without constructing the polynomial? – Halbort Jun 30 '15 at 21:11
  • Lagrange interpolation if you know enough values of your function. You implicitly did this sort of thing with your sin example. A polynomial function which takes the value 0 infinitely often is identically 0, which sin(x) is not. – Geoff Robinson Jun 30 '15 at 21:14
  • Yes but how would I prove that something is a polynomial as opposed to some other type of function that passes through those values. – Halbort Jun 30 '15 at 21:16
  • A classic strengthening of @AnthonyQuas's answer: suppose that for any $x$ there exists $n=n(x)$ so that $f^{(n)}(x)=0$. Then $f$ is a polynomial. http://mathoverflow.net/questions/34059/if-f-is-infinitely-differentiable-then-f-coincides-with-a-polynomial – Otis Chodosh Jun 30 '15 at 21:20
  • If you have the function value at enough points, and it really is polynomial, then the polynomial given by Lagrange interpolation will eventually stabilize to the right polynomial, and adding extra points and function values will not change it. If that doesn't happen, the function isn't polynomial. – Geoff Robinson Jun 30 '15 at 21:37
  • OK I see what you mean Geoff Robinson. Thank you for your help! – Halbort Jun 30 '15 at 21:38
  • If $f$ is entire and there are at least two values that $f$ takes only finitely many times, then $f$ is a polynomial. – Robert Israel Jun 30 '15 at 23:39
  • @Robert Israel I may be missing something but doesn't $e^x$ satisfy your criteria – Halbort Jul 01 '15 at 00:11
  • @Robert Israel Sorry I feel stupid. – Halbort Jul 01 '15 at 00:16
  • Only defined on the positive integers? Take Anthony's suggestion, and convert to this: take the difference finitely many times, and get identically zero. The difference $\Delta F$ of $F$ is: $\Delta F(n) = F(n+1)-F(n)$. – Gerald Edgar Jul 01 '15 at 01:01
  • Are there any other ways besides finite differences? – Halbort Jul 01 '15 at 01:05
  • Why was my question downvoted? – Halbort Jul 01 '15 at 01:55
  • I think perhaps by the way it was phrased. If it were something like: "What conditions on a function imply that it is a polynomial, if I am only allowed to use its values on the integers?" Otherwise it looks like a total rookie question. It is best to specify what sort of function, and what the domain is. Is it defined on $\mathbb{R}$? $\mathbb{C}$? Is it smooth? Analytic? Merely continuous? – David Roberts Jul 01 '15 at 03:04
  • @GeraldEdgar I think you gave the answer and this iterated difference method was used in the old times (Galileo ...). The first time $\Delta^n$ annihilates $F$ is the degree of $F$ plus one. – Duchamp Gérard H. E. Jul 01 '15 at 03:58
  • Why is this off topic? – Halbort Jul 01 '15 at 13:50
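Gerald Edgar's finite-difference test from the comments is easy to try numerically. Here is a minimal Python sketch (the helper names are made up for illustration; it assumes you can only sample finitely many values, so it detects "annihilated by $\Delta^{m}$ on the sampled range" rather than proving polynomiality outright):

```python
def diffs(seq):
    """One application of the forward difference operator: (Delta F)(n) = F(n+1) - F(n)."""
    return [b - a for a, b in zip(seq, seq[1:])]

def looks_polynomial(values, max_degree):
    """Check whether applying Delta (max_degree + 1) times annihilates the sampled values.

    `values` should contain at least max_degree + 2 terms so the final
    difference sequence is nonempty.
    """
    seq = list(values)
    for _ in range(max_degree + 1):
        seq = diffs(seq)
    return all(v == 0 for v in seq)

# F(n) = n^3 is annihilated by Delta^4 but not by Delta^3:
cubes = [n**3 for n in range(10)]
print(looks_polynomial(cubes, 3))  # True
print(looks_polynomial(cubes, 2))  # False

# 2^n is never annihilated by finitely many differences (Delta 2^n = 2^n):
print(looks_polynomial([2**n for n in range(10)], 5))  # False
```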

2 Answers


If it satisfies a linear difference equation with characteristic polynomial $(t-1)^m $ for some $m$, the sequence is a polynomial.

This is essentially equivalent to the statement by Gerald Edgar in a comment, that taking iterated differences, we eventually reach a constant sequence.

So, more formally, $\{s_n\}$ is a sequence that is given by a polynomial iff there is an $m$ such that $$ \sum_{j=0}^m s_{i+j} (-1)^j\binom{m}{j} = 0$$ for all $i$. Note that $\sum_{j=0}^m t^j (-1)^j\binom{m}{j} =(t-1)^m$, which explains the connection.
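As a sanity check, this recurrence can be verified numerically. A minimal Python sketch (the helper name is illustrative): for a degree-$d$ polynomial sequence, $m = d+1$ makes the alternating sum vanish for every $i$, while smaller $m$ does not.

```python
from math import comb

def recurrence_residue(s, m, i):
    """Left-hand side of the recurrence with characteristic polynomial (t-1)^m."""
    return sum((-1)**j * comb(m, j) * s[i + j] for j in range(m + 1))

s = [n**2 for n in range(12)]  # a degree-2 polynomial sequence
print(all(recurrence_residue(s, 3, i) == 0 for i in range(8)))  # True: (t-1)^3 annihilates it
print(all(recurrence_residue(s, 2, i) == 0 for i in range(8)))  # False: degree 2 needs m = 3
```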

For example, my favourite proof that the number of skew semi-standard Young tableaux of shape $n\lambda$ is a polynomial in $n$ uses this observation. I recently put a paper on arXiv that basically uses this idea.

A third way to reformulate this is that the generating function can be expressed as $$\sum_{i=0}^\infty s_i t^i = \frac{c_0 + c_1t + \cdots + c_{m-1}t^{m-1}}{(1-t)^m}$$ for some $c_i$.
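Numerically, clearing the denominator amounts to convolving the sequence with the coefficients of $(1-t)^m$. A small Python sketch (the helper name is hypothetical) checks that for $s_n = n^2$ and $m = 3$ the numerator really is a polynomial of degree below $m$:

```python
from math import comb

def numerator_coeffs(s, m, terms):
    """Coefficients of (sum_i s_i t^i) * (1 - t)^m, truncated to `terms` terms."""
    return [sum((-1)**j * comb(m, j) * s[i - j]
                for j in range(min(m, i) + 1))
            for i in range(terms)]

s = [n**2 for n in range(12)]  # s_n = n^2, so m = 3 should clear the denominator
c = numerator_coeffs(s, 3, 9)
print(c)  # [0, 1, 1, 0, 0, 0, 0, 0, 0]: the numerator is t + t^2, higher terms vanish
```

This matches the classical identity $\sum_{n\ge 0} n^2 t^n = t(1+t)/(1-t)^3$.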


This is the same answer as given several times already. It is prefaced with a few facts on Taylor series to make it seem familiar. First recall that a real function $f(x)$ well enough behaved at $x=0$ has a Taylor series $$f(x)=\sum_0^{\infty}a_k\frac{x^k}{k!} $$ valid in some interval. And

$f(x)$ is a polynomial exactly if there is a $K$ with $a_k=0$ when $k \gt K$ (and then the interval of validity is the entire real line).

Similarly, any real function $f(n)$ defined on $\mathbb{N}$ (a.k.a. a sequence) has a unique expansion $$f(n)=\sum_0^{\infty}a_k\frac{(n)_k}{k!}$$

valid on the whole domain. And

$f$ is a polynomial (on $\mathbb{N}$) exactly if there is a $K$ with $a_k=0$ when $k \gt K.$

This leaves several things to explain.

For the Taylor series, $a_k=(D^kf)\ (0)$ where $D=\frac{d}{dx}$ is the differential operator which sends $f(x)$ to the derivative $f'(x)$.

For sequences on $\mathbb{N}$, $a_k=(\Delta^kf)\ (0)$ where $\Delta$ is the difference operator which sends the sequence

$f(0),f(1),f(2),f(3),\cdots$

to the sequence of differences

$f(1)-f(0),f(2)-f(1),f(3)-f(2),f(4)-f(3)\cdots$

Also, $(n)_k=n(n-1)(n-2)\cdots(n-k+1)$ is the falling factorial. Of course the question concerned $\mathbb{Z}$ rather than $\mathbb{N}.$

For the domain $\mathbb{Z},$ we can first restrict to $\mathbb{N}$, determine if we have a polynomial and, if so, then check that the expansion is valid on all of $\mathbb{Z}.$

The expressions $\frac{(n)_k}{k!}$ were used to point out the relation to Taylor series. Note, however, that $\frac{(n)_k}{k!}=\binom{n}{k}$ is a binomial coefficient.
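In practice the $a_k=(\Delta^k f)(0)$ fall out of a forward-difference table: the top entry of each successive row of differences. A minimal Python sketch (the function name is illustrative):

```python
from math import comb

def newton_coeffs(values):
    """a_k = (Delta^k f)(0): the first entry of each row of the difference table."""
    a, seq = [], list(values)
    while seq:
        a.append(seq[0])
        seq = [b - x for x, b in zip(seq, seq[1:])]
    return a

f = [n**3 + n for n in range(8)]  # f(n) = n^3 + n sampled at n = 0..7
a = newton_coeffs(f)
print(a)  # [0, 2, 6, 6, 0, 0, 0, 0]: differences beyond k = 3 vanish, so degree 3

# Reconstruct f(n) = sum_k a_k * binom(n, k):
rebuilt = [sum(ak * comb(n, k) for k, ak in enumerate(a)) for n in range(8)]
print(rebuilt == f)  # True
```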

Examination of these sequences $\binom{n}{k}$ makes it clear that they constitute a basis (of sorts) for the space of sequences defined on $\mathbb{N}.$ The first few are

$$\begin{array}{cccccccc} 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 \cdots \\ 0 & 1 & 2 & 3 & 4 & 5 & 6 & 7\cdots\\ 0 & 0 & 1 & 3 & 6 & 10 & 15 & 21\cdots \\ 0 & 0 & 0 & 1 & 4 & 10 & 20 & 35 \cdots\\ 0 & 0 & 0 & 0 & 1 & 5 & 15 & 35\cdots \end{array}$$ Clearly there is a unique linear combination of these sequences for any target sequence. Although the sum is potentially infinite, it is finite for each fixed value of $n.$

Furthermore, $f$ takes integer values on all of $\mathbb{N}$ exactly if the $a_k$ are all integers.
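For instance, $f(n)=\binom{n}{2}=n(n-1)/2$ is integer-valued on $\mathbb{N}$ even though its coefficients in the monomial basis $1, n, n^2, \dots$ are the non-integers $\frac12$ and $-\frac12$; its coefficients $a_k$ in the binomial basis, by contrast, are integers. A quick check using the same difference-table idea (the function name is illustrative):

```python
def newton_coeffs(values):
    """a_k = (Delta^k f)(0), read off a forward-difference table."""
    a, seq = [], list(values)
    while seq:
        a.append(seq[0])
        seq = [b - x for x, b in zip(seq, seq[1:])]
    return a

# f(n) = n(n-1)/2 is integer-valued on N, although in the monomial basis
# its coefficients are the non-integers 1/2 and -1/2.
f = [n * (n - 1) // 2 for n in range(6)]
print(newton_coeffs(f))  # [0, 0, 1, 0, 0, 0] -- every a_k is an integer
```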