
Suppose that $X$ is a Multinomial($n, \mathbf{p}$) r.v., where $\mathbf{p} = (p_1, \ldots, p_k)$. That is, $X$ is a random vector in $\{0, 1, \ldots, n\}^k$.

Find its multivariate moment generating function $M_{\mathbf{X}}$, defined by $M_{\mathbf{X}}(\mathbf{t}) := \mathbb{E}[\exp(\mathbf{t}^T \mathbf{X})] = \mathbb{E}\left[\exp\left(\sum\limits_{i=1}^k t_{i} X_{i}\right)\right]$.

So far from the definition I have done:

$\mathbb{E}\left[\exp\left(\sum\limits_{i=1}^k t_{i} X_{i}\right)\right]$

$= \mathbb{E}\left[\prod\limits_{j=1}^n \exp(\mathbf{t}^T \mathbf{Y}_j)\right]$ (writing $\mathbf{X} = \sum_{j=1}^n \mathbf{Y}_j$, where $\mathbf{Y}_j$ is the indicator vector of the outcome of trial $j$)

$= \prod\limits_{j=1}^n \mathbb{E}\left[\exp(\mathbf{t}^T \mathbf{Y}_j)\right]$ (by independence of the trials)

I know each factor is in the form of a moment generating function, so

$= \prod\limits_{j=1}^n M_{\mathbf{Y}_j}(\mathbf{t})$

Could you explain the next part and correct any prior mistakes? I feel like I'm missing something obvious.

Thank you.

J.banks

2 Answers


Let $S$ denote the set $\{ \mathbf{x} \in \{ 0, \ldots, n\}^k : \sum_{i=1}^k x_i = n \}$. Then \begin{align} \mathbb{E}\left[ \exp \left( \sum_{i=1}^k t_iX_i \right) \right] &= \sum_{\mathbf{x} \in S} \binom{n}{x_1, \ldots, x_k} \prod_{i=1}^k p_i^{x_i} e^{t_ix_i} && \text{(definition of expectation)} \\ &= \sum_{\mathbf{x} \in S} \binom{n}{x_1, \ldots, x_k} \prod_{i=1}^k \left(p_ie^{t_i} \right)^{x_i} && \text{(combine the exponents of $x_i$)}\\ &=\left( \sum_{i=1}^k p_ie^{t_i}\right)^n && \text{(multinomial theorem)} \end{align}
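As an aside (an editorial addition, not part of the original answer), the closed form can be checked mechanically against a brute-force enumeration of the support for small $n$ and $k$; the function names below are illustrative, not a standard API.

```python
from itertools import product as cartesian
from math import exp, factorial

def mgf_exact(n, p, t):
    """E[exp(t . X)] by summing over every outcome with x_1 + ... + x_k = n."""
    total = 0.0
    for x in cartesian(range(n + 1), repeat=len(p)):
        if sum(x) != n:
            continue
        # Multinomial coefficient n! / (x_1! ... x_k!), computed in exact integers.
        coeff = factorial(n)
        for xi in x:
            coeff //= factorial(xi)
        term = float(coeff)
        for pi, ti, xi in zip(p, t, x):
            term *= (pi * exp(ti)) ** xi
        total += term
    return total

def mgf_closed(n, p, t):
    """Closed form (sum_i p_i e^{t_i})^n from the answer above."""
    return sum(pi * exp(ti) for pi, ti in zip(p, t)) ** n

n, p, t = 4, (0.2, 0.3, 0.5), (0.1, -0.2, 0.4)
assert abs(mgf_exact(n, p, t) - mgf_closed(n, p, t)) < 1e-9
```

At $\mathbf{t} = \mathbf{0}$ both expressions reduce to $1$, as any MGF must.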

Note that for multinomial distributions, the components $X_1, \ldots, X_k$ are not independent (they are constrained to sum to $n$), so the MGF does not factor across components.

Siong Thye Goh
    Hard for me to digest this solution, fully. Can someone make it more obvious? – Guannan Shen Jan 28 '21 at 19:46
  • @GuannanShen take a look at my solution below when you have the chance. I just broke down Siong's solution into smaller components, so hopefully it helps somewhat. Let me know if something isn't clear or if something is incorrect. – Paul Ash Feb 10 '22 at 01:15

Suppose $X_1, X_2, \ldots, X_{k-1}$ have a multinomial distribution, where $x_1 + x_2 + \cdots + x_{k-1} \leq n$ for non-negative integers $x_1, x_2, \ldots, x_{k-1}$, and $n$ is the number of independent times a random experiment is performed. Let $p_k = 1-p_1-p_2- \cdots-p_{k-1}$ and let $x_k = n-x_1- x_2 -\cdots-x_{k-1}.$

Then, the probability mass function of $X_1, X_2, \ldots, X_{k-1}$ is given by:

$$ p(x_1,x_2, \ldots, x_{k-1}) = \frac{n!}{x_1!x_2! \cdots x_{k-1}!x_k!} \ p_1^{x_1}p_2^{x_2} \cdots p_{k-1}^{x_{k-1}} \ p_k^{x_k} $$

The moment generating function of $X_1, X_2, \ldots, X_{k-1}$, denoted by $M(t_1, t_2, \ldots, t_{k-1}),$ is a special expectation where:

\begin{align} M(t_1, t_2, \ldots, t_{k-1}) &= \mathbf{E}\left(e^{t_1X_1+t_2X_2+ \cdots +t_{k-1}X_{k-1}}\right) \\ &= \sum_{x_1=0}^{n} \ \sum_{x_2=0}^{n-x_1}\cdots \sum_{x_{k-1}=0}^{n-x_1-\cdots-x_{k-2}} e^{t_1x_1+t_2x_2+ \cdots +t_{k-1}x_{k-1}} \, p(x_1,x_2, \ldots, x_{k-1}) \\ &= \sum_{x_1=0}^{n} \ \sum_{x_2=0}^{n-x_1}\cdots \sum_{x_{k-1}=0}^{n-x_1-\cdots-x_{k-2}} \frac{n!}{x_1!x_2! \cdots x_{k-1}!x_k!} \, (p_1e^{t_1})^{x_1}(p_2e^{t_2})^{x_2}\cdots (p_{k-1}e^{t_{k-1}})^{x_{k-1}}p_k^{x_k} \end{align}

Since $n$ is a positive integer and $p_1e^{t_1}, p_2e^{t_2},\ldots, p_k$ are fixed constants, and the sum runs over all non-negative integer solutions of $x_1 + \cdots + x_k = n$, the multinomial theorem collapses the above sum to:

$$ (p_1e^{t_1}+p_2e^{t_2}+\cdots+p_{k-1}e^{t_{k-1}}+p_k)^n $$

which is $M(t_1,t_2, \ldots, t_{k-1})$.
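As a quick editorial sanity check on this closed form (not part of the original answer): differentiating the MGF recovers the familiar mean of a multinomial component,

$$ \left.\frac{\partial M}{\partial t_1}\right|_{\mathbf{t}=\mathbf{0}} = \left. n\left(p_1e^{t_1}+\cdots+p_{k-1}e^{t_{k-1}}+p_k\right)^{n-1} p_1e^{t_1} \right|_{\mathbf{t}=\mathbf{0}} = n \cdot 1^{n-1} \cdot p_1 = np_1 = \mathbf{E}(X_1). $$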

Paul Ash