The question is: $x_i = \alpha + \omega_i$, for $i = 1, \ldots, n$,
where $\alpha$ is a non-zero but unknown constant parameter to be estimated, and the $\omega_i$ are uncorrelated, zero-mean Gaussian random variables with known variances $\sigma_i^2$. Note that $\sigma_i^2$ and $\sigma_j^2$, for $i \neq j$, may be distinct. We wish to estimate $\alpha$ from a weighted sum of the $x_i$, i.e.
$$\hat{\alpha} = \sum^n_{i=1}b_ix_i$$
Determine $b_i$, $i= 1, \ldots, n$, such that $\hat{\alpha}$ is unbiased and the variance of $\hat{\alpha}$ is as small as possible.
I have tried to use the unbiasedness condition and obtained $\sum_{i=1}^n b_i = 1$.
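In case it helps, here is (what I believe is) the reasoning behind that constraint: since $E[\omega_i] = 0$, we have $E[x_i] = \alpha$, so
$$E[\hat{\alpha}] = \sum_{i=1}^n b_i E[x_i] = \alpha \sum_{i=1}^n b_i,$$
and requiring $E[\hat{\alpha}] = \alpha$ for non-zero $\alpha$ forces $\sum_{i=1}^n b_i = 1$.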
I don't know how to use the condition that the variance of $\hat{\alpha}$ should be as small as possible.
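My attempt so far (assuming I may use the fact that the $\omega_i$ are uncorrelated) is to write
$$\operatorname{Var}(\hat{\alpha}) = \operatorname{Var}\!\left(\sum_{i=1}^n b_i x_i\right) = \sum_{i=1}^n b_i^2 \sigma_i^2,$$
so I believe the problem reduces to minimizing $\sum_{i=1}^n b_i^2 \sigma_i^2$ subject to $\sum_{i=1}^n b_i = 1$, but I don't see how to carry out this constrained minimization.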