
I am trying to understand parametric estimation theory in more depth. The classic Capture-Recapture method assumes a population of size $N$, of which $R\leq N$ individuals are captured, marked and released. A second catch of size $m\leq N$ then contains $X$ marked individuals, where $X$ is a hypergeometric random variable. Since the ratio of marked to unmarked individuals should be reflected in the second catch, it is reasonable to estimate the unknown $N$ with the estimator $$\hat N(X)=\frac{Rm}{X}.$$
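To make the heuristic explicit: since $X\sim\mathrm{Hypergeom}(N,R,m)$ has mean $\mathbb{E}[X]=\frac{Rm}{N}$, equating $X$ with its expectation and solving for $N$ gives
$$\frac{X}{m}\approx\frac{R}{N}\quad\Longrightarrow\quad \hat N(X)=\frac{Rm}{X}.$$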

However, the probability that $X=0$ is not zero (it is positive whenever $N-R\geq m$), and thus the expectation of $\hat N$ does not exist: on the event $\{X=0\}$ the estimate would be infinite.
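As a quick numerical illustration (the numbers $N=1000$, $R=100$, $m=50$ here are just made up):

```python
from scipy.stats import hypergeom

# Made-up numbers: true population size, number marked, size of second catch
N_true, R, m = 1000, 100, 50

# X ~ Hypergeometric: m draws without replacement from a pool of N_true
# individuals, of which R are marked
X = hypergeom(M=N_true, n=R, N=m)

print("P(X = 0) =", X.pmf(0))   # strictly positive since N_true - R >= m
# On the event {X = 0} the estimator Rm/X is undefined (infinite),
# so E[Rm/X] does not exist.
```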

Still, I find that this method is described everywhere as a suitable tool. I understand that in practice one can simply continue to catch individuals until at least one marked one appears, or assume the population is small enough that the $X=0$ issue is negligible, but that does not fit with the math. In particular, the method of moments and the MLE both fail.

Are there any ways to circumvent this issue?

I've seen other estimators of the form (with the same conditions on $X$) $$\hat N(X)=\frac{c}{X+1}+d,$$ but I do not understand where this comes from.
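What I can do is compute the exact bias of such an estimator for any given constants $c$ and $d$ (placeholders here) by summing against the hypergeometric pmf; a minimal sketch:

```python
import numpy as np
from scipy.stats import hypergeom

def expected_modified_estimator(N_pop, R, m, c, d):
    """Exact E[c/(X+1) + d] for X ~ Hypergeom(N_pop, R, m).

    c and d are placeholders; the question is which choice makes this
    expectation equal (or close) to N_pop."""
    x = np.arange(0, min(R, m) + 1)
    pmf = hypergeom.pmf(x, M=N_pop, n=R, N=m)
    return np.sum(pmf * (c / (x + 1) + d))

# Made-up numbers, just to probe the bias for a candidate (c, d):
print(expected_modified_estimator(N_pop=1000, R=100, m=50, c=100 * 50, d=0))
```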

Another approach would be to use the negative hypergeometric distribution, where one draws samples without replacement from the population until the first marked individual appears. But I seem to be unable to find anything about it in the literature. Any hints are welcome.

EDIT: It is possible to obtain an unbiased estimator with $Y$ negative hypergeometric. The estimator for $N$ is then $$\hat N=\frac{R+1}{k}\,Y-1,$$ where $k$ is the number of marked individuals we want to catch and $Y$ is the number of draws (without replacement) until the $k$-th marked individual appears. Increasing $k$ also decreases the variance.
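A small Monte Carlo sketch (the population size and the other numbers are made up) that supports the unbiasedness claim numerically:

```python
import numpy as np

rng = np.random.default_rng(0)

N_true, R, k = 500, 60, 5       # made-up: population size, number marked, target recaptures
n_trials = 20_000

population = np.zeros(N_true, dtype=int)
population[:R] = 1              # 1 = marked, 0 = unmarked

estimates = np.empty(n_trials)
for t in range(n_trials):
    order = rng.permutation(population)          # drawing without replacement = random order
    Y = np.flatnonzero(order)[k - 1] + 1         # 1-based position of the k-th marked individual
    estimates[t] = (R + 1) / k * Y - 1

print("true N:", N_true, "mean of estimator:", estimates.mean())
```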
