# Unequal probability inverse sampling

## Section 4. Simple random sampling without replacement

For the case without replacement, the notation is the same as for the draw with replacement. The number of failures ${X}_{i}$ now has a negative hypergeometric distribution. This probability distribution is little known, to the point that it has been presented as a “forgotten” distribution by Miller and Fridell (2007). It is the counterpart of the negative binomial for draws without replacement. The general framework is as follows: we consider a population of size $M$ containing $M{p}_{i}$ favourable units, namely the occupations in the list that exist in the enterprise. If units are drawn with equal probability and without replacement until $r$ favourable units appear, then the negative hypergeometric variable ${X}_{i}\sim NH\left(M,r,M{p}_{i}\right)$ counts the number of failures before the $r$ favourable events occur.

The probability distribution is

$\mathrm{Pr}\left({X}_{i}=x\right)=p\left(x;M,r,M{p}_{i}\right)=\frac{\left(\begin{array}{c}x+r-1\\ x\end{array}\right)\left(\begin{array}{c}M-x-r\\ M{p}_{i}-r\end{array}\right)}{\left(\begin{array}{c}M\\ M{p}_{i}\end{array}\right)},$

where $x\in \left\{0,\dots ,M\left(1-{p}_{i}\right)\right\},$ $M\in \left\{1,2,\dots \right\},$ $M{p}_{i}\in \left\{1,2,\dots ,M\right\},$ and $r\in \left\{1,2,\dots ,M{p}_{i}\right\}.$

Its expectation and variance are

$\text{E}\left({X}_{i}\right)=\frac{Mr\left(1-{p}_{i}\right)}{M{p}_{i}+1},\text{var}\left({X}_{i}\right)=\frac{rM\left(1-{p}_{i}\right)\left(M+1\right)\left(M{p}_{i}-r+1\right)}{{\left(M{p}_{i}+1\right)}^{2}\left(M{p}_{i}+2\right)}.$
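As a quick numerical check (a sketch, not part of the source; the values of $M,$ $M{p}_{i}$ and $r$ below are arbitrary illustrations), the probability function above can be implemented with `math.comb` and compared against the closed-form mean and variance:

```python
from math import comb

def nhg_pmf(x, M, r, K):
    """Negative hypergeometric pmf: K = M*p_i favourable units,
    x = number of failures before the r-th favourable draw."""
    return comb(x + r - 1, x) * comb(M - x - r, K - r) / comb(M, K)

M, K, r = 30, 12, 4            # illustrative values: M*p_i = 12, so p_i = 0.4
p = K / M
support = range(0, M - K + 1)  # x in {0, ..., M(1 - p_i)}

probs = [nhg_pmf(x, M, r, K) for x in support]
mean = sum(x * q for x, q in zip(support, probs))
var = sum((x - mean) ** 2 * q for x, q in zip(support, probs))

# Closed-form moments from the text
mean_formula = M * r * (1 - p) / (K + 1)
var_formula = r * M * (1 - p) * (M + 1) * (K - r + 1) / ((K + 1) ** 2 * (K + 2))
```

The pmf sums to one over the support, and the numerical moments match the formulas to floating-point precision.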

Again, ${A}_{ik}$ denotes the number of times that unit $k$ is selected in the sample. Now, the value of ${A}_{ik}$ can be only 0 or 1. If $n$ units are selected using a simple design without replacement in $L,$ the sample design is defined as

$\mathrm{Pr}\left({A}_{ik}={a}_{ik},k\in L\right)={\left(\begin{array}{c}M\\ n\end{array}\right)}^{-1},$

where ${a}_{ik}\in \left\{0,1\right\},$ and

$\sum _{k\in L}\text{\hspace{0.17em}}{a}_{ik}=n.$

If the vector of ${A}_{ik}$ is conditioned on a fixed size in one part of the population, we have

$\begin{array}{ll}\mathrm{Pr}\left({A}_{ik}={a}_{ik},k\in {F}_{i}\,|\,\sum _{k\in {F}_{i}}{A}_{ik}=r\right)\hfill & =\frac{\mathrm{Pr}\left({A}_{ik}={a}_{ik},k\in {F}_{i}\ \text{and}\ \sum _{k\in {F}_{i}}{A}_{ik}=r\right)}{\mathrm{Pr}\left(\sum _{k\in {F}_{i}}{A}_{ik}=r\right)}\hfill \\ \hfill & ={\left[\frac{\left(\begin{array}{c}M{p}_{i}\\ r\end{array}\right)\left(\begin{array}{c}M-M{p}_{i}\\ n-r\end{array}\right)}{\left(\begin{array}{c}M\\ n\end{array}\right)}\right]}^{-1}\sum _{\begin{array}{c}{a}_{ik},k\in {D}_{i}\\ {\sum }_{k\in {D}_{i}}{a}_{ik}=n-r\\ {a}_{ik}\in \left\{0,1\right\}\end{array}}\frac{1}{\left(\begin{array}{c}M\\ n\end{array}\right)}\hfill \\ \hfill & ={\left[\frac{\left(\begin{array}{c}M{p}_{i}\\ r\end{array}\right)\left(\begin{array}{c}M-M{p}_{i}\\ n-r\end{array}\right)}{\left(\begin{array}{c}M\\ n\end{array}\right)}\right]}^{-1}\frac{\left(\begin{array}{c}M-M{p}_{i}\\ n-r\end{array}\right)}{\left(\begin{array}{c}M\\ n\end{array}\right)}\hfill \\ \hfill & ={\left(\begin{array}{c}M{p}_{i}\\ r\end{array}\right)}^{-1},\hfill \end{array}$

with

$\sum _{k\in {F}_{i}}\text{\hspace{0.17em}}{a}_{ik}=r.$

This shows that, if the sum of ${A}_{ik}$ is conditioned on one part of the population, we still have a simple design without replacement. In the procedure in which we draw without replacement until we obtain $r$ occupations in enterprise $i,$ we therefore have
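This claim can be verified exhaustively for a small population (a sketch with arbitrary sizes, not from the source): enumerate all equal-probability samples of size $n,$ keep those with exactly $r$ hits in ${F}_{i},$ and check that every pattern of $r$ hits in ${F}_{i}$ has the same conditional probability $1/\binom{M{p}_{i}}{r}.$

```python
from itertools import combinations
from math import comb
from collections import Counter

M, n, K, r = 8, 4, 3, 2              # illustrative: |L| = 8, sample size 4, |F_i| = 3
F = set(range(K))                     # favourable units (occupations present)

counts = Counter()
total = 0
for s in combinations(range(M), n):   # all C(M, n) equal-probability samples
    hits = frozenset(set(s) & F)
    if len(hits) == r:                # condition on exactly r hits in F_i
        counts[hits] += 1
        total += 1

cond_probs = {h: c / total for h, c in counts.items()}
expected = 1 / comb(K, r)             # each r-subset of F_i equally likely
```

Every $r$-subset of ${F}_{i}$ appears with the same conditional probability, confirming that conditioning leaves a simple design without replacement on that part.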

$\text{E}\left({A}_{ik}\text{\hspace{0.17em}}|\text{\hspace{0.17em}}{X}_{i}\right)=\left\{\begin{array}{ll}\frac{r}{M{p}_{i}}\hfill & \text{if}\text{\hspace{0.17em}}\text{\hspace{0.17em}}k\in {F}_{i}\hfill \\ \frac{{X}_{i}}{M-M{p}_{i}}\hfill & \text{if}\text{\hspace{0.17em}}\text{\hspace{0.17em}}k\in {D}_{i}.\hfill \end{array}$

The inclusion probability is therefore

${\pi }_{k\text{\hspace{0.17em}}|\text{\hspace{0.17em}}i}=\text{EE}\left({A}_{ik}\text{\hspace{0.17em}}|\text{\hspace{0.17em}}{X}_{i}\right)=\left\{\begin{array}{ll}\frac{r}{M{p}_{i}}\hfill & \text{if}\text{\hspace{0.17em}}\text{\hspace{0.17em}}k\in {F}_{i}\hfill \\ \frac{\text{E}\left({X}_{i}\right)}{M-M{p}_{i}}=\frac{r}{M{p}_{i}+1}\hfill & \text{if}\text{\hspace{0.17em}}\text{\hspace{0.17em}}k\in {D}_{i},\hfill \end{array}$
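The two inclusion probabilities can also be checked by simulating the inverse-sampling procedure itself (a sketch; the population sizes and the seed below are arbitrary): draw units without replacement until $r$ favourable units have appeared, then compare the empirical selection frequencies with $r/\left(M{p}_{i}\right)$ and $r/\left(M{p}_{i}+1\right).$

```python
import random

M, K, r = 20, 8, 3                   # illustrative: M*p_i = 8 favourable units
reps = 40_000
rng = random.Random(42)

sel_count = [0] * M                  # units 0..K-1 form F_i, the rest form D_i
for _ in range(reps):
    order = rng.sample(range(M), M)  # a random draw order without replacement
    selected, successes = [], 0
    for k in order:
        selected.append(k)
        if k < K:
            successes += 1
            if successes == r:       # stop at the r-th favourable unit
                break
    for k in selected:
        sel_count[k] += 1

pi_F_hat = sum(sel_count[:K]) / (K * reps)
pi_D_hat = sum(sel_count[K:]) / ((M - K) * reps)
pi_F, pi_D = r / K, r / (K + 1)      # theoretical values from the text
```

With these values the empirical frequencies settle near $3/8=0.375$ for ${F}_{i}$ and $3/9\approx 0.333$ for ${D}_{i}.$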

for all $k\in L.$ Again, the problem is that we know $M,$ $r$ and ${X}_{i},$ but not ${p}_{i}.$ We can estimate ${p}_{i}$ by maximum likelihood, which requires a numerical method.

Using the method of moments, an estimate can be obtained by solving for ${p}_{i}$ in the equation ${X}_{i}=\text{E}\left({X}_{i}\right)$ , that is,

${X}_{i}=\frac{Mr\left(1-{\stackrel{^}{p}}_{i}\right)}{M{\stackrel{^}{p}}_{i}+1}.$

Hence

${\stackrel{^}{p}}_{i1}=\frac{Mr-{X}_{i}}{M\left(r+{X}_{i}\right)}.$

However, a short calculation shows that, if $r\ge 2,$

${\stackrel{^}{p}}_{i2}=\frac{r-1}{r+{X}_{i}-1}$

is unbiased for ${p}_{i}.$
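Both claims can be checked exactly by summing each estimator against the negative hypergeometric pmf over the full support of ${X}_{i}$ (a sketch with arbitrary $M,$ ${p}_{i}$ and $r$): ${\stackrel{^}{p}}_{i2}$ averages exactly to ${p}_{i},$ while the moment estimator ${\stackrel{^}{p}}_{i1}$ carries a visible bias.

```python
from math import comb

def nhg_pmf(x, M, r, K):
    """pmf of X_i ~ NH(M, r, K), with K = M*p_i favourable units."""
    return comb(x + r - 1, x) * comb(M - x - r, K - r) / comb(M, K)

M, K, r = 30, 12, 4
p = K / M
support = range(0, M - K + 1)

# Exact expectations of the two estimators under the NH distribution
E_p1 = sum((M * r - x) / (M * (r + x)) * nhg_pmf(x, M, r, K) for x in support)
E_p2 = sum((r - 1) / (r + x - 1) * nhg_pmf(x, M, r, K) for x in support)
```

Here `E_p2` equals $p_i = 0.4$ up to floating-point error, while `E_p1` does not.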

Again, we use weights that are the inverses of ${\pi }_{k\text{\hspace{0.17em}}|\text{\hspace{0.17em}}i}.$ The inverses of the inclusion probabilities are thus estimated as follows:

$\stackrel{^}{1/{\pi }_{k\text{\hspace{0.17em}}|\text{\hspace{0.17em}}i}}=\left\{\begin{array}{lll}\frac{M{\stackrel{^}{p}}_{i2}}{r}\hfill & =\frac{M\left(r-1\right)}{r\left({X}_{i}+r-1\right)}\hfill & \text{if}\text{\hspace{0.17em}}\text{\hspace{0.17em}}k\in {F}_{i}\hfill \\ \frac{M\left(1-{\stackrel{^}{p}}_{i2}\right)}{{X}_{i}}\hfill & =\frac{M}{{X}_{i}+r-1}\hfill & \text{if}\text{\hspace{0.17em}}\text{\hspace{0.17em}}k\in {D}_{i}.\hfill \end{array}$

These weights are also used in the estimator of Murthy (1957), which is unbiased (see also Salehi and Seber 2001). If $M{p}_{i}\le r,$ all occupations will be selected in enterprise $i$ and the estimated inclusion probabilities are then equal to 1.
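Given an observed ${X}_{i},$ the estimated weights follow directly; a minimal sketch (with arbitrary values) confirming that the two expressions in each row of the display above agree:

```python
M, r = 30, 4
X_i = 6                        # observed number of failures

p_hat2 = (r - 1) / (r + X_i - 1)

w_F = M * p_hat2 / r           # estimated weight for k in F_i
w_D = M * (1 - p_hat2) / X_i   # estimated weight for k in D_i

# Simplified closed forms from the text
w_F_simplified = M * (r - 1) / (r * (X_i + r - 1))
w_D_simplified = M / (X_i + r - 1)
```

For these values both rows agree: the weight is $2.5$ in ${F}_{i}$ and $30/9\approx 3.33$ in ${D}_{i}.$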
