telmo-correa / all-of-statistics

Self-study on Larry Wasserman's "All of Statistics"

Exercise 10.13.7 #31

Open Pipe-Vash opened 1 year ago

Pipe-Vash commented 1 year ago

The solution presented for parts a) through c) is not general: it already assumes that $X_1\sim \mathrm{Binomial}(n_1,p_1)$ and $X_2\sim \mathrm{Binomial}(n_2,p_2)$ each consist of only one datapoint, which we only learn in part d).

Do not confuse the number of datapoints of $X_1$ and $X_2$ (call them $N_1$ and $N_2$) with the number of Bernoulli trials contained in each $X_i$, which the exercise gives as $n_i$ (i.e., $n_i$ is the number of $\mathrm{Bernoulli}(p_i)$ repetitions).

The MLE of $p_i$ from $X_i \sim \mathrm{Binomial}(n_i,p_i)$, consisting of $N_i$ datapoints, comes from maximizing $$\ell_{N_i}(p_i)=\log\left(L_{N_i}(p_i)\right)=\log(p_i)\sum_{j=1}^{N_i} X_{i,j} + \log(1-p_i)\sum_{j=1}^{N_i}\left(n_i-X_{i,j}\right) +\sum_{j=1}^{N_i} \log\binom{n_i}{X_{i,j}}$$
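Setting the derivative of this log-likelihood to zero makes the MLE explicit (a short derivation in the notation above):

```latex
\frac{\partial \ell_{N_i}}{\partial p_i}
  = \frac{1}{p_i}\sum_{j=1}^{N_i} X_{i,j}
  - \frac{1}{1-p_i}\sum_{j=1}^{N_i}\left(n_i - X_{i,j}\right) = 0
\quad\Longrightarrow\quad
\hat{p}_i = \frac{1}{N_i n_i}\sum_{j=1}^{N_i} X_{i,j}
```

i.e., the total number of successes divided by the total number of Bernoulli trials, which reduces to the single-datapoint answer $\hat{p}_i = X_i/n_i$ when $N_i = 1$.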

The inverse Fisher information evaluated at the MLEs $(\hat{p}_1,\hat{p}_2)$ (the entries are the estimated variances, so this is $I^{-1}$, not $I$) is then

$$I^{-1}(\hat{p}_1,\hat{p}_2)=\begin{bmatrix} \frac{\hat{p}_1 \left(1- \hat{p}_1\right)}{N_1 n_1} & 0 \\ 0 & \frac{\hat{p}_2 \left(1- \hat{p}_2\right)}{N_2 n_2} \end{bmatrix}$$
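A quick numerical sanity check of the claim above (a minimal NumPy sketch; the sample sizes $N$, $n$ and the true $p$ are hypothetical, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N = 500 binomial datapoints, each with n = 20 Bernoulli trials.
N, n, p = 500, 20, 0.3
X = rng.binomial(n, p, size=N)

# MLE: total successes divided by total number of Bernoulli trials (N * n).
p_hat = X.sum() / (N * n)

# Estimated variance from the inverse Fisher information: p_hat(1 - p_hat) / (N n).
var_hat = p_hat * (1 - p_hat) / (N * n)

print(p_hat, var_hat)
```

With $N n = 10{,}000$ total trials, `p_hat` lands very close to the true `p`, and `var_hat` shrinks like $1/(N n)$, matching the diagonal entries of the matrix above.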