21

By a smooth finite dimensional vector space I mean a smooth manifold $M$ together with smooth operations $+ : M \times M \rightarrow M$ and $\cdot : \mathbb{R} \times M \rightarrow M$ turning $M$ into a finite dimensional vector space. Is any linear functional $\phi : M \rightarrow \mathbb{R}$ then necessarily smooth?

Certainly every finite dimensional vector space $V$ has a canonical smooth structure making both the vector space operations and all linear functionals smooth. But I can’t immediately see why $V$ couldn’t also have some other smooth structure keeping the vector space operations smooth but destroying the smoothness of some linear functional.

Tristan Bice
  • 1,327
  • 8
    There is only one Hausdorff vector space topology on a finite dimensional real (or complex) vector space. Then all linear functionals $\phi$ are continuous and differentiable with (Fréchet) derivative $\phi'(x)=\phi$ for all $x$. It seems to me that I do not understand where the problem is. – Jochen Wengenroth Oct 12 '23 at 19:56
  • 7
    @JochenWengenroth: The topology doesn't determine the smooth structure. For example, we could have $M=\mathbb R$ with coordinate $t=x^3$ (only chart) and the usual vector space structure. Then $\phi(x)=x=t^{1/3}$ is not smooth (nor are the vector space operations). – Christian Remling Oct 12 '23 at 21:59
  • 1
    Exactly, the identity functional is no longer smooth as a map from this manifold $M$ to $\mathbb{R}$. However, as you say, this doesn't answer my question because addition is also no longer smooth in $M$, as $(\sqrt[3]{s}+\sqrt[3]{t})^3$ is not differentiable at $s=t=0$. – Tristan Bice Oct 12 '23 at 22:05
  • 6
    But in line with Jochen's remark, you could rephrase my question as "is there a UNIQUE smooth structure on every finite dimensional vector space making the vector space operations smooth?" As I mentioned, there is a canonical such smooth structure, but I don't see an obvious reason why it has to be the only one. – Tristan Bice Oct 12 '23 at 22:11

3 Answers

16

The other answer is probably more satisfying since it develops things from scratch, but we can also observe that $+$ makes $M$ an abelian Lie group. Inversion is smooth since we can realize it as $-x=(-1)\cdot x$.

Thus $M\cong \mathbb R^m \times (S^1)^n$, as a Lie group. See here; $M$ is connected because $tx$, $0\le t\le 1$, gets us from $0\in M$ to an arbitrary $x\in M$ (or because we assumed it from the beginning). We cannot have any $S^1$ factors since they have torsion.
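To spell out the torsion point: $(S^1)^n$ with $n\ge 1$ contains elements of finite order (e.g. the class of $\tfrac12$ in the first factor), whereas the additive group of a vector space is torsion-free, since
$$\underbrace{x+\cdots+x}_{k}=0 \implies x=\tfrac{1}{k}\,(kx)=0.$$
So the $(S^1)^n$ factor must be trivial.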

So $(M,\oplus )\cong (\mathbb R^m, +)$, but then the scalar multiplication $\odot$ on $M$ also corresponds to the standard structure because we can reconstruct $tx$ from the addition, first for $t\in\mathbb Q$ and then for all $t$ by continuity.
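In more detail, a sketch of this last step: any Lie group isomorphism $\Psi:(M,\oplus)\to(\mathbb R^m,+)$ is additive, hence $\mathbb Q$-linear, since for integers $p$ and $q>0$ the element $\tfrac{p}{q}\odot x$ is the unique $y$ with $q\odot y=y\oplus\cdots\oplus y$ equal to $p\odot x$. Since $\odot$ and $\Psi$ are continuous, $\Psi$ is then $\mathbb R$-linear:
$$\Psi(t\odot x)=\lim_{\mathbb Q\ni t_k\to t}\Psi(t_k\odot x)=\lim_{t_k\to t} t_k\,\Psi(x)=t\,\Psi(x),$$
so $\Psi$ also intertwines the scalar multiplications.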

12

Here is a short argument that you may find more convincing, which doesn't seem to assume anything beyond basic linear algebra, the inverse function theorem, and existence/uniqueness of solutions to ODEs.

Let $M$ be a smooth vector space. The tangent space $T_0 M$ is also a vector space, and it is functorial for smooth linear maps.

There is a natural transformation $\phi_M: M \to T_0 M$ given by $x \mapsto \gamma_x'(0)$, where $\gamma_x(t) = tx$, meaning that if $f: M \to N$ is a smooth linear map, we have $df_0 \phi_M = \phi_N f$. If we know that $\phi_M$ is a natural isomorphism, then we see that $f$ is invertible if and only if $df_0$ is invertible; because $df_x$ is canonically identified with $df_0$ (as $f$ is linear), we see that when $f$ is a smooth linear map which is invertible as a linear map, every $df_x$ is invertible, so $f$ is a local diffeomorphism by the inverse function theorem and hence, being bijective, a diffeomorphism.
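Spelled out, once $\phi_M$ and $\phi_N$ are known to be isomorphisms, the naturality relation can be rewritten as
$$df_0=\phi_N\circ f\circ\phi_M^{-1},$$
so $df_0$ is a linear isomorphism exactly when $f$ is.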


Let me prove that $\phi_M$ is additive; that it respects scaling is similar. That it is invertible follows because there exists a unique solution $\gamma_{v}$ to $\gamma(0) = 0$ and $\gamma'(t) = v$, where we identify $T_{\gamma(t)} M \cong T_0 M$ by translation. (This is just the statement that the exponential map exists; it is inverse to $\phi_M$.)

Lemma. If $P: M \times M \to M$ is the addition map, the map $dP_{(0,0)}$ is given by addition.

Proof. Because $P(x, 0) = x$ and $P(0, y) = y$, we have $dP_{(0,0)}(v,0) = v$ and $dP_{(0,0)}(0, w) = w$. Because $dP_{(0,0)}$ is linear, we see that it's given by adding the two coordinates.

Now $$\phi_M(P(x,y)) = \frac{d}{dt}\bigg|_{t=0} tP(x,y) = \frac{d}{dt}\bigg|_{t=0} P(tx,ty) = dP_{(0,0)}(\phi_M x, \phi_M y) = \phi_M x + \phi_M y.$$ That is, $\phi_M$ is additive.
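For what it's worth, the scaling case mentioned above can be checked in the same one-variable spirit: for $s\in\mathbb R$, the curve $t\mapsto t(sx)$ is just $\gamma_x$ reparametrised, so by the chain rule
$$\phi_M(sx)=\frac{d}{dt}\bigg|_{t=0} t(sx)=\frac{d}{dt}\bigg|_{t=0}\gamma_x(ts)=s\,\gamma_x'(0)=s\,\phi_M(x).$$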


Now pick a basis $(e_i)$ of $M$. The map $f: \Bbb R^n \to M$ sending $(a_i)$ to $\sum a_i e_i$ is a smooth linear isomorphism; smoothness uses finite dimensionality and the fact that scaling and addition are smooth. By the above, $df_x$ is an isomorphism for all $x$, so $f$ is in fact a diffeomorphism. Hence $M$ is equivalent as a smooth vector space to the Euclidean space of the appropriate dimension.
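To illustrate the smoothness claim in the simplest nontrivial case $n=2$: the map factors as
$$(a_1,a_2)\;\longmapsto\;(a_1\cdot e_1,\;a_2\cdot e_2)\;\longmapsto\;a_1\cdot e_1+a_2\cdot e_2,$$
a composite of (restrictions of) the smooth scalar multiplication $\cdot:\mathbb R\times M\to M$ and the smooth addition $+:M\times M\to M$; for general finite $n$ one iterates.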

mme
  • 9,388
  • 1
    Sounds convincing. But is it really obvious, for example, that $x\mapsto\gamma_x'(0)$ is linear? I'm certainly tempted to write $(\gamma_x+\gamma_y)'(0)=\gamma_x'(0)+\gamma_y'(0)$. But the $+$ on the left is really just a somewhat arbitrary smooth binary function on $M$, so calculating $(\gamma_x+\gamma_y)'(0)$ should require the chain rule no? On the other hand, the $+$ on the right is addition in the tangent space, which is intrinsic to the smooth structure of $M$. – Tristan Bice Oct 12 '23 at 23:14
  • 1
    @Tristan It uses naturality of $\phi$ and that $\phi_{M \times M} = \phi_M \times \phi_M$ (which ultimately reduces to the fact that the differential of a product of two maps $\Bbb R \to M$ is the product of the differentials). That's additivity; that it respects scalars is similar. – mme Oct 12 '23 at 23:35
  • 1
    I am using the chain rule in the sense that the differential of a composite is the composite of the differentials, but I don't think I'm making any assumptions about $M$. – mme Oct 12 '23 at 23:37
  • 1
    For smooth maps $a:M\times M\rightarrow M$, $f:\mathbb{R}\rightarrow M$ and $g:\mathbb{R}\rightarrow M$ with $0_M=f(0)=g(0)=a(0_M,0_M)$, I don't think it is generally true that $(a(f,g))'(0)=f'(0)+g'(0)$. But this seems to be what you're assuming, with $a$ being addition on $M$, $f=\gamma_x$ and $g=\gamma_y$. Rather it should be $(a(f,g))'(0)=a'_{0_M,0_M}(f'(0),g'(0))$, by the chain rule. – Tristan Bice Oct 12 '23 at 23:55
  • 1
    @TristanBice You're certainly right that it's not true in general, the point is that $da_0$ is given by addition here. However, that takes a little nontrivial work to argue (more than I realized): this is the first-order term in the Baker-Campbell-Hausdorff formula. The proof is simple, though, and I will add it to the post. – mme Oct 13 '23 at 00:12
  • 1
    OK, I'm convinced now, thanks for the clarification. Incidentally, I guess this means the last part of condition (2) in the characterisation of smooth vector bundles given in https://mathoverflow.net/questions/333485/is-a-smooth-family-of-vector-spaces-always-locally-trivial is then redundant, i.e. smooth $n$-dimensional vector bundles are just smooth submersions $p:E\rightarrow M$ with smooth maps $+:E\times_pE\rightarrow E$ and $\cdot:\mathbb{R}\times E\rightarrow E$ making each fibre $E_x$ an $n$-dimensional vector space. This was my original motivation for asking this. – Tristan Bice Oct 13 '23 at 01:30
5

Here is a much shorter approach, which really does reduce to the implicit function theorem (no existence and uniqueness of solutions to Lipschitz ODEs). Can anyone do better and get rid of IFT?

Let $f: M \to N$ be a smooth linear isomorphism (say $n = \dim M = \dim N$). I claim that the inverse of $f$ is also smooth. (Equivalently, I claim that $f$ is a submersion.) This solves your problem: the map $\Bbb R^n \to M$ determined by a basis, as in the previous answer, is a smooth linear isomorphism, so its inverse is smooth too and $M$ is diffeomorphic to $\Bbb R^n$ as a smooth vector space.

To see this, observe that $df_x$ is canonically identified with $df_0$, using appropriate translation isomorphisms. So $\text{rank}(df_x)$ is constant in $x$. If $\text{rank}(df_x) < n$, I claim that $f$ is not injective; granted this claim, since $f$ is in fact injective we get $\text{rank}(df_x) = n$ for all $x$, so $f$ is a local diffeomorphism everywhere and hence, being bijective, a diffeomorphism, proving the desired result.
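The identification is a routine check using only that translations are smooth: since $f$ is linear, $f(y)=f(x)+f(y-x)$, i.e. $f=\tau_{f(x)}\circ f\circ\tau_{-x}$, where $\tau_a$ denotes translation by $a$ (smooth, with smooth inverse $\tau_{-a}$, because addition is smooth). The chain rule then gives
$$df_x = d(\tau_{f(x)})_0\circ df_0\circ d(\tau_{-x})_x,$$
and the outer factors are isomorphisms, so $\operatorname{rank}(df_x)=\operatorname{rank}(df_0)$.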

To see this: by the constant rank theorem, the map $f: M \to N$ is locally modeled on a linear map $L: \Bbb R^n \to \Bbb R^n$ of rank $r < n$. It follows that $f$ is nowhere locally injective.
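Concretely, the constant rank theorem gives charts around any point of $M$ and its image in which
$$f(x_1,\dots,x_n)=(x_1,\dots,x_r,0,\dots,0),\qquad r<n,$$
and this normal form sends all points differing only in the last coordinate to the same image, so no neighbourhood of any point maps injectively.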

mme
  • 9,388