Best Online Estimation MCQs 1

Online Estimation MCQs for the preparation of the PPSC and FPSC Statistics Lecturer posts. There are 20 multiple-choice questions covering topics related to the properties of a good estimator (unbiasedness, efficiency, sufficiency, consistency, and invariance), expectation, point estimates, and interval estimates. Let us start with the Online Estimation MCQs Quiz.

Online MCQs about Estimates and Estimation for Preparation of PPSC and FPSC Statistics Lecturer Post

1. Let $X_1,X_2,\cdots,X_n$ be a random sample from a density $f(x|\theta)$, where $\theta$ is a value of the random variable $\Theta$ with known density $g_\Theta(\theta)$. Then the estimator of $\tau(\theta)$ with respect to the prior $g_\Theta(\theta)$, defined as $E[\tau(\Theta)|X_1,X_2,\cdots,X_n]$, is called

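This conditional expectation is computed from the posterior distribution of $\Theta$. As a minimal illustration (all data and parameter values below are assumed, not part of the quiz), the conjugate Beta-Bernoulli pair gives this posterior mean in closed form:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: Bernoulli(theta) data with a conjugate Beta(2, 2) prior.
alpha, beta, theta_true, n = 2.0, 2.0, 0.7, 50
x = rng.binomial(1, theta_true, size=n)

# Posterior is Beta(alpha + sum(x), beta + n - sum(x)); its mean is the
# estimator E[Theta | X_1, ..., X_n] under squared-error loss.
post_mean = (alpha + x.sum()) / (alpha + beta + n)
print(f"posterior mean estimate of theta: {post_mean:.3f}")
```
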
2. $Var_\theta(T) \geq \frac{[\tau'(\theta)]^2}{n E\left[\left(\frac{\partial}{\partial \theta}\log f(X;\theta)\right)^2\right]}$, where $T=t(X_1,X_2,\cdots,X_n)$ is an unbiased estimator of $\tau(\theta)$. The above inequality is called

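For a concrete check of this bound (a sketch under an assumed Bernoulli model, not part of the quiz): with $X\sim Bernoulli(\theta)$ and $\tau(\theta)=\theta$, the right-hand side reduces to $\theta(1-\theta)/n$, and the sample mean attains it exactly:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 0.3, 40, 20000

# Simulated variance of the sample mean, an unbiased estimator of theta.
T = rng.binomial(1, theta, size=(reps, n)).mean(axis=1)

# Lower bound for the Bernoulli model: theta * (1 - theta) / n.
bound = theta * (1 - theta) / n
print(f"simulated Var(T): {T.var():.5f}  vs  bound: {bound:.5f}")
```
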
3. If $E(\hat{\theta})=\theta$, then $\hat{\theta}$ is said to be


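A short simulation makes this definition concrete (assumed normal data, illustrative only): the sample mean averages out to $\mu$, while the variance estimator that divides by $n$ instead of $n-1$ does not average out to $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n, reps = 5.0, 2.0, 10, 50000
samples = rng.normal(mu, sigma, size=(reps, n))

# E(sample mean) equals mu: the sample mean is unbiased for mu.
print(f"average of sample means: {samples.mean(axis=1).mean():.3f} (mu = {mu})")

# Dividing by n (ddof=0) underestimates sigma^2 on average: biased.
print(f"average of n-divisor variances: {samples.var(axis=1, ddof=0).mean():.3f} "
      f"(sigma^2 = {sigma ** 2})")
```
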
4. For a biased estimator $\hat{\theta}$ of $\theta$, which one is correct?

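The standard fact behind this question is the mean-squared-error decomposition, which holds for any estimator, biased or not:

$$MSE(\hat{\theta}) = E[(\hat{\theta}-\theta)^2] = Var(\hat{\theta}) + [Bias(\hat{\theta})]^2, \quad \text{where } Bias(\hat{\theta}) = E(\hat{\theta}) - \theta.$$
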
5. Let $Z_1,Z_2,\cdots,Z_n$ be independently and identically distributed random variables, satisfying $E[|Z_i|]<\infty$. Let $N$ be an integer-valued random variable whose value $n$ depends only on the values of the first $n$ $Z_i$s. Suppose $E(N)<\infty$; then $E(Z_1+Z_2+\cdots+Z_N)=E(N)E(Z_1)$ is called

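The identity can be verified numerically. The sketch below assumes, for simplicity, an $N$ drawn independently of the $Z_i$ (a special case of the stated condition), with exponential $Z_i$:

```python
import numpy as np

rng = np.random.default_rng(5)
mu_z, reps = 2.0, 20000

# N independent of the Z_i: a simple special case of the stopping condition.
N = rng.poisson(4, size=reps) + 1              # integer-valued, E(N) = 5
totals = np.array([rng.exponential(mu_z, size=k).sum() for k in N])

# The identity: E(Z_1 + ... + Z_N) = E(N) * E(Z_1) = 5 * 2 = 10.
print(f"simulated E(sum): {totals.mean():.3f}  vs  E(N)*E(Z_1): {5 * mu_z:.3f}")
```
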
6. A set of jointly sufficient statistics is defined to be minimal sufficient if and only if


7. If $f(x_1,x_2,\cdots,x_n;\theta)=g(\hat{\theta};\theta)h(x_1,x_2,\cdots,x_n)$, then $\hat{\theta}$ is


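This is the factorization criterion. A standard worked example (Bernoulli sampling, shown here for illustration) makes the factorization explicit:

$$f(x_1,\cdots,x_n;\theta)=\prod_{i=1}^{n}\theta^{x_i}(1-\theta)^{1-x_i}=\underbrace{\theta^{\sum x_i}(1-\theta)^{n-\sum x_i}}_{g(\hat{\theta};\theta)}\cdot\underbrace{1}_{h(x_1,\cdots,x_n)},$$

with $\hat{\theta}=\sum_{i=1}^{n}x_i$, so the sum of the observations is a sufficient statistic for $\theta$.
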
8. A test is said to be the most powerful test of size $\alpha$, if


9. Which of the following assumptions are required to show the consistency, unbiasedness, and efficiency of the OLS estimator?

  1. $E(\mu_t)=0$
  2. $Var(\mu_t)=\sigma^2$
  3. $Cov(\mu_t,\mu_{t-j})=0;t\neq t-j$
  4. $\mu_t \sim N(0,\sigma^2)$

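The assumptions listed above are the classical error assumptions of the linear model $y_t=\beta_0+\beta_1 x_t+\mu_t$. As a hedged illustration (assumed model and parameter values, not part of the quiz), errors generated to satisfy them yield OLS slope estimates that center on the true $\beta_1$:

```python
import numpy as np

rng = np.random.default_rng(9)
b0, b1, sigma, n, reps = 1.0, 2.0, 1.0, 100, 5000
x = rng.uniform(0, 10, size=n)

slopes = np.empty(reps)
for r in range(reps):
    # Errors with E(u) = 0, Var(u) = sigma^2, no serial correlation, normal.
    u = rng.normal(0, sigma, size=n)
    y = b0 + b1 * x + u
    slopes[r] = np.polyfit(x, y, 1)[0]         # OLS slope estimate

print(f"average OLS slope: {slopes.mean():.3f} (true beta_1 = {b1})")
```
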
10. Which of the following statements describes an interval estimate?


11. If $Var(\hat{\theta})\rightarrow 0$ as $n \rightarrow \infty$, then $\hat{\theta}$ is said to be

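This shrinking-variance condition (combined with unbiasedness) is easy to see by simulation; the sketch below uses an assumed standard normal population and the sample mean, whose variance is $\sigma^2/n$:

```python
import numpy as np

rng = np.random.default_rng(11)
mu, sigma, reps = 0.0, 1.0, 20000

# Var(sample mean) = sigma^2 / n, which tends to 0 as n grows.
for n in (10, 100, 1000):
    means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    print(f"n = {n:4d}: Var(theta_hat) ~ {means.var():.5f}")
```
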
12. What are the main components of a confidence interval?


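A confidence interval is built from a point estimate plus or minus a margin of error, where the margin of error is a critical value times the standard error (the same margin-of-error quantity asked about in Question 16 below). A minimal sketch with assumed data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
x = rng.normal(50, 8, size=40)                 # assumed sample

point_estimate = x.mean()
std_error = x.std(ddof=1) / np.sqrt(len(x))
critical = stats.t.ppf(0.975, df=len(x) - 1)   # two-sided 95% t critical value
margin_of_error = critical * std_error

print(f"95% CI: {point_estimate:.2f} +/- {margin_of_error:.2f}")
```
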
13. In statistical inference, the best asymptotically normal estimator is denoted by


14. If $Var(T_2) < Var(T_1)$, then $T_2$ is


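Comparing estimator variances is the idea of relative efficiency. In the assumed normal setting below (illustrative only), the sample mean plays the role of the lower-variance estimator and the sample median the higher-variance one:

```python
import numpy as np

rng = np.random.default_rng(14)
n, reps = 25, 20000
samples = rng.normal(0, 1, size=(reps, n))

var_mean = samples.mean(axis=1).var()
var_median = np.median(samples, axis=1).var()
print(f"Var(mean): {var_mean:.4f}  Var(median): {var_median:.4f}")
print(f"relative efficiency (median to mean): {var_median / var_mean:.2f}")
```
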
15. Let $X_1,X_2,\cdots,X_n$ be a random sample from the density $f(x;\theta)$, where $\theta$ may be a vector. If the conditional distribution of $X_1,X_2,\cdots,X_n$ given $S=s$ does not depend on $\theta$ for any value $s$ of $S$, then the statistic $S$ is called

16. What is the maximum expected difference between a population parameter and a sample estimate?


17. If the conditional distribution of $X_1,X_2,\cdots,X_n$ given $S=s$ does not depend on $\theta$ for any value of $S=s$, the statistic $S=s(X_1,X_2,\cdots,X_n)$ is called

18. Let $L(\theta;X_1,X_2,\cdots,X_n)$ be the likelihood function for a sample $X_1,X_2,\cdots,X_n$ having joint density $f(x_1,x_2,\cdots,x_n;\theta)$, where $\theta$ belongs to the parameter space $\Theta$. Then a test statistic defined as $\lambda=\lambda_n=\lambda(x_1,x_2,\cdots,x_n)=\frac{\sup_{\theta\in\Theta_0}L(\theta;x_1,x_2,\cdots,x_n)}{\sup_{\theta\in\Theta}L(\theta;x_1,x_2,\cdots,x_n)}$ is called

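The ratio $\lambda$ can be computed directly in a simple case. The sketch below assumes $N(\theta,1)$ data and $H_0:\theta=\theta_0$; the denominator's supremum over all of $\Theta$ is attained at the MLE $\bar{x}$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(18)
theta0, n = 0.0, 30
x = rng.normal(0.4, 1.0, size=n)               # assumed data, true theta = 0.4

def log_lik(theta):
    # Log-likelihood of N(theta, 1) data.
    return stats.norm.logpdf(x, loc=theta, scale=1.0).sum()

# Numerator: sup over Theta_0 = {theta0}; denominator: sup over Theta, at x-bar.
lam = np.exp(log_lik(theta0) - log_lik(x.mean()))
print(f"ratio lambda = {lam:.4f}  (values near 0 argue against H0)")
```
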
19. For two estimators $T_1=t_1(X_1,X_2,\cdots,X_n)$ and $T_2=t_2(X_1,X_2,\cdots,X_n)$, the estimator $T_1$ for which $R_{t_1}(\theta)\leq R_{t_2}(\theta)$ for all $\theta$ in $\Theta$ is defined to be

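Here $R_t(\theta)$ is the risk function of the estimator, e.g., the mean squared error under squared-error loss. A sketch evaluating two assumed estimators of a normal mean at a few $\theta$ values:

```python
import numpy as np

rng = np.random.default_rng(19)
n, reps = 20, 20000

for theta in (-1.0, 0.0, 1.0):
    x = rng.normal(theta, 1.0, size=(reps, n))
    t1, t2 = x.mean(axis=1), np.median(x, axis=1)
    # MSE risk of each estimator at this theta, under squared-error loss.
    r1, r2 = ((t1 - theta) ** 2).mean(), ((t2 - theta) ** 2).mean()
    print(f"theta = {theta:+.1f}: R_t1 = {r1:.4f}, R_t2 = {r2:.4f}")
```
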
20. If $f(x_1,x_2,\cdots,x_n;\theta)$ is the joint density of $n$ random variables $X_1,X_2,\cdots,X_n$, considered as a function of $\theta$, then $L(\theta;x_1,x_2,\cdots,x_n)=f(x_1,x_2,\cdots,x_n;\theta)$ is called

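This function rereads the joint density with the data held fixed and $\theta$ varying. A minimal sketch, assuming Bernoulli data, evaluates it on a grid of $\theta$ values and locates the maximizer:

```python
import numpy as np

rng = np.random.default_rng(20)
x = rng.binomial(1, 0.6, size=30)              # assumed Bernoulli(0.6) sample

theta_grid = np.linspace(0.01, 0.99, 99)
# L(theta; x) = prod_i theta^(x_i) * (1 - theta)^(1 - x_i) on the grid.
L = theta_grid ** x.sum() * (1 - theta_grid) ** (len(x) - x.sum())

print(f"grid maximizer: {theta_grid[L.argmax()]:.2f}  sample mean: {x.mean():.2f}")
```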