Your probability distribution is incomplete: the given probabilities sum to $\sum_{k=1}^n \frac{1}{2^k} = 1-\frac{1}{2^n} < 1$, so the entropy is not well-defined.
The minimum entropy is achieved when all of the remaining probability $\frac{1}{2^n}$ is placed on a single additional value, which we'll just take to be $X=0$; splitting that mass among several values could only increase the entropy, since each piece would have probability at most $\frac{1}{2^n}$. In that case (using $\log$ base $2$):
$$\begin{align} H(X) &= - \sum_{k=0}^n P(X=k)\log P(X=k) \\
&=-\frac{1}{2^n}\log \frac{1}{2^n} - \sum_{k=1}^n \frac{1}{2^k}\log \frac{1}{2^k}\\
&=\frac{n}{2^n} + \sum_{k=1}^n \frac{k}{2^k}
\end{align}$$
This is the minimum possible value of the entropy; there is no upper bound.
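To see that there is no upper bound, note that spreading the leftover mass $\frac{1}{2^n}$ uniformly over $M$ additional values contributes
$$M\cdot\left(-\frac{1}{M\,2^n}\log_2\frac{1}{M\,2^n}\right)=\frac{n+\log_2 M}{2^n}$$
to the entropy, which grows without bound as $M\to\infty$.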
Using $$\sum_{k=1}^n kx^k = x\,\frac {nx^{n+1}-(n+1)x^n +1}{(x-1)^2}$$ with $x=\frac 1 2$, we can compute $\sum_{k=1}^n \frac{k}{2^k}$, which I think yields a minimum entropy of:
$$2-\frac{1}{2^{n-1}}$$
But I don't entirely trust my calculation here.
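As a sanity check, substituting $x=\tfrac{1}{2}$ (so $(x-1)^2=\tfrac14$) gives
$$\sum_{k=1}^n \frac{k}{2^k} = 2\left(\frac{n}{2^{n+1}} - \frac{n+1}{2^n} + 1\right) = 2-\frac{n+2}{2^n},$$
and therefore
$$H(X) = \frac{n}{2^n} + 2 - \frac{n+2}{2^n} = 2-\frac{1}{2^{n-1}}.$$
For $n=1$ this is $1$, which matches the entropy of the completed distribution $\left\{\tfrac12,\tfrac12\right\}$.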
Perhaps the commenter above is right, and the real intent of the problem is that $X$ ranges over all natural numbers $1,\dots,n,\dots$ with $P(X=n)=\frac{1}{2^n}$. Then the entropy is the limit as $n\to\infty$ of the value computed above, which is $2$ (assuming my calculation is correct).
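Indeed, in that case the entropy can be computed directly, since $-\log_2 \frac{1}{2^k} = k$:
$$H(X) = -\sum_{k=1}^\infty \frac{1}{2^k}\log_2\frac{1}{2^k} = \sum_{k=1}^\infty \frac{k}{2^k} = 2.$$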