
I don't understand this statement from a textbook:

"Dual space could have an inner product that is induced from the vector space."

Suppose there is a vector space $V$. The inner product is given by $\langle v,v\rangle = v^i v^i$ (summing over $i$), or in matrix form

$$\begin{bmatrix} v^1 & v^2 & v^3\\ \end{bmatrix} \begin{bmatrix} v^1 \\ v^2 \\ v^3 \\ \end{bmatrix}$$

How are inner products on the dual space constructed in a similar way?

Tursinbay

2 Answers


Let $V$ be a finite-dimensional vector space over the field $\Bbb{R}$, and let $g:V \times V \to \Bbb{R}$ be an inner product on $V$. (I write $g$ rather than $\langle \cdot, \cdot\rangle$ simply for convenience.)

Then, recall that the dual space $V^*$ is by definition the set of all linear transformations from $V$ into $\Bbb{R}$. Now, using the inner product $g$ on $V$, we can construct the following map: $g^{\flat}:V \to V^*$ defined by \begin{align} g^{\flat}(x) = g(x, \cdot) \end{align} In other words, $g^{\flat}$ assigns to each vector $x \in V$ the element of $V^*$ such that for all $y \in V$, $\left(g^{\flat}(x) \right)(y) = g(x, y)$.

Now, using the fact that $g$ is an inner product, it is easy enough to verify (just unwind all the definitions) that $g^{\flat}$ is a linear map, and it is also injective. Hence, by the rank-nullity theorem, it is also surjective, so $g^{\flat}:V \to V^*$ is an isomorphism of finite-dimensional vector spaces. Now, let's denote the inverse of $g^{\flat}$ by $(g^{\flat})^{-1} \equiv g^{\sharp}:V^* \to V$.
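As a small concrete illustration (the specific inner product here is my own choice, not part of the original question): take $V = \Bbb{R}^2$ with $g(x,y) = x^1 y^1 + 2x^2 y^2$, so the matrix of $g$ in the standard basis $\{e_1, e_2\}$ is $G = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$. Then \begin{align} g^{\flat}(e_1) = g(e_1, \cdot): y \mapsto y^1, \qquad g^{\flat}(e_2) = g(e_2, \cdot): y \mapsto 2y^2, \end{align} and inverting, $g^{\sharp}$ sends the covector $y \mapsto y^1$ back to $e_1$ and the covector $y \mapsto y^2$ to $\frac{1}{2}e_2$. In components, $g^{\flat}$ is "multiply by $G$" and $g^{\sharp}$ is "multiply by $G^{-1}$".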

Now, you can define the function $h$ on $V^*$ as follows: define $h:V^* \times V^* \to \Bbb{R}$ by \begin{align} h(\phi, \psi) &:= g \left( g^{\sharp}(\phi), g^{\sharp}(\psi)\right) \end{align} I'll leave it to you to verify that $h$ satisfies all the properties of an inner product.
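Continuing the illustrative example from above: write $\phi^1: y \mapsto y^1$ and $\phi^2: y \mapsto y^2$ for the coordinate covectors. Then, for instance, \begin{align} h(\phi^2, \phi^2) = g\left(g^{\sharp}(\phi^2), g^{\sharp}(\phi^2)\right) = g\left(\tfrac{1}{2}e_2, \tfrac{1}{2}e_2\right) = \tfrac{1}{4}\cdot 2 = \tfrac{1}{2}, \end{align} and similarly $h(\phi^1, \phi^1) = 1$ and $h(\phi^1, \phi^2) = 0$. So while $g$ has matrix $G$, the induced inner product $h$ has matrix $G^{-1}$; in index notation this is the familiar statement that the inverse components $g^{ij}$ give the inner product of covectors.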

Note that while this is a lot of constructions, the idea is actually very simple. To define an inner product on $V^*$ means you need to tell me how to construct a number from two elements $\phi, \psi \in V^*$. Well, the above recipe says first "convert" $\phi, \psi$ from "dual vectors" into vectors by applying $g^{\sharp}$ to them. Then, since $g^{\sharp}(\phi)$ and $g^{\sharp}(\psi)$ are vectors in $V$, we can take their inner product using $g$ to get a number.


The above answer is the completely basis-free construction of how to get an inner product on $V^*$ from the one on $V$. Now, if we resort to a basis, the computation is actually very simple. Let $\{e_1, \dots, e_n\}$ be a basis of $V$ which is orthonormal with respect to the inner product $g$ (i.e. $g(e_i, e_j) = \delta_{ij}$). Also, let $\{\epsilon^1, \dots, \epsilon^n\}$ be the unique dual basis of $V^*$, which means that by definition, for all $i,j$, we have $\epsilon^i(e_j) = \delta_{ij}$. It is easy to verify that $\epsilon^i = g^{\flat}(e_i)$, and hence $\{\epsilon^1, \dots, \epsilon^n\}$ will be an orthonormal basis of $V^*$ with respect to the inner product $h$; the verification is spelled out below.
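Spelling out the verification left to the reader: for every basis vector $e_j$, \begin{align} \left(g^{\flat}(e_i)\right)(e_j) = g(e_i, e_j) = \delta_{ij} = \epsilon^i(e_j), \end{align} so $g^{\flat}(e_i)$ and $\epsilon^i$ agree on a basis and are therefore equal. Consequently, \begin{align} h(\epsilon^i, \epsilon^j) = g\left(g^{\sharp}(\epsilon^i), g^{\sharp}(\epsilon^j)\right) = g(e_i, e_j) = \delta_{ij}, \end{align} which is exactly the claimed orthonormality of the dual basis with respect to $h$.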


So, for computations, if $\{e_1, \dots, e_n\}$ is an orthonormal basis of $V$, then to compute $g(x,y)$ what we can do is first expand $x,y$ in terms of the basis: \begin{align} x = \sum_{i=1}^n x^i e_i \quad \text{and} \quad y = \sum_{i=1}^n y^i e_i \end{align} ($x^i, y^i \in \Bbb{R}$ being scalars). Then, (by orthonormality) \begin{align} g(x,y) = \sum_{i=1}^n x^i y^i \end{align}
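For instance (with made-up coefficients, just to illustrate the formula): taking $n = 3$, $x = e_1 + 2e_2 - e_3$ and $y = 3e_1 + e_3$ gives \begin{align} g(x,y) = (1)(3) + (2)(0) + (-1)(1) = 2, \end{align} which is precisely the $v^i v^i$-style computation from the question.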

Now, the inner product on the dual space is not that much different: given $\phi, \psi \in V^*$, to compute $h(\phi, \psi)$, what you can do is first expand them in terms of the dual basis $\{\epsilon^1, \dots, \epsilon^n\}$: \begin{align} \phi = \sum_{i=1}^n \phi_i \epsilon^i \quad \text{and} \quad \psi = \sum_{j=1}^n \psi_j \epsilon^j \end{align} ($\phi_i, \psi_j \in \Bbb{R}$ being scalars). Then, (by the orthonormality) \begin{align} h(\phi, \psi) = \sum_{i=1}^n \phi_i \psi_i \end{align}
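A matching check on the dual side (again with made-up coefficients): taking $n = 3$, $\phi = \epsilon^1 + 2\epsilon^2 - \epsilon^3$ and $\psi = 3\epsilon^1 + \epsilon^3$ gives \begin{align} h(\phi, \psi) = (1)(3) + (2)(0) + (-1)(1) = 2. \end{align} So, with respect to an orthonormal basis and its dual basis, the inner products on $V$ and $V^*$ are computed by exactly the same formula applied to the components.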

peek-a-boo
  • Excellent explanation. You are denoting the field in which the inner product takes values as $F$, rather than $\Bbb{R}$. – Tursinbay Dec 24 '19 at 19:54
  • @Tursinbay oh yes I should probably change the field to $\Bbb{R}$ because over general fields, inner products do not make sense (and also, since if we work over $\Bbb{C}$, then $g^{\flat}$ wouldn't be an isomorphism anymore so I would have to re-word a lot of stuff) – peek-a-boo Dec 24 '19 at 19:59

The dual space, $V^*$, of a given vector space $V$ is the set of all linear functions from $V$ to its field of scalars (typically the real or complex numbers). That set becomes a vector space itself by defining addition and scalar multiplication as $(f+g)(v) = f(v) + g(v)$ and $(af)(v) = a\,f(v)$. If $V$ is $n$-dimensional then $V^*$ is also $n$-dimensional. Given a basis $\{v_1, v_2, \dots, v_n\}$ of $V$, we get the set of functions $\{f_1, f_2, \dots, f_n\}$, where $f_i$ is defined by $f_i(v_i) = 1$ and $f_i(v_j) = 0$ for $j \neq i$, and is extended to all vectors "by linearity": $f_i(a_1 v_1 + a_2 v_2 + \dots + a_n v_n) = a_1 f_i(v_1) + a_2 f_i(v_2) + \dots + a_n f_i(v_n) = a_i$. So given a vector $v$ in $V$, this associates to it a unique function $v^*$ in $V^*$: write $v$ as a linear combination of the basis vectors of $V$, then define $v^*$ to be the linear combination of the corresponding basis vectors of $V^*$ with the same scalar coefficients.
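For example (a small illustration of my own): in $V = \Bbb{R}^2$ with basis $\{v_1, v_2\}$, the vector $v = 2v_1 + 3v_2$ is associated to $v^* = 2f_1 + 3f_2$, i.e. the function sending $w = b_1 v_1 + b_2 v_2$ to \begin{align} v^*(w) = 2f_1(w) + 3f_2(w) = 2b_1 + 3b_2. \end{align}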

Once we have a dual space, we can define a "dot product" $u \cdot v$ on $V$ by taking the function $u^*$ associated to the vector $u$ and applying it to $v$: $u \cdot v := u^*(v)$.

Similarly, given two functions $u^*$ and $v^*$ in $V^*$, we can define a "dot product" $u^* \cdot v^*$ on $V^*$ by taking the vector $v$ associated to the function $v^*$ and applying $u^*$ to it: $u^* \cdot v^* := u^*(v)$.
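Continuing the small example above: with $u^* = 2f_1 + 3f_2$ and $v^* = f_1 - f_2$ (so that $v = v_1 - v_2$), this gives \begin{align} u^* \cdot v^* = u^*(v) = 2(1) + 3(-1) = -1, \end{align} the same number as the coordinate dot product of the coefficient lists $(2,3)$ and $(1,-1)$.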

user247327