Multiple Choice Questions from Statistical Inference for the preparation of exams and statistical job tests in Government, Semi-Government, and Private Organizations. These tests are also helpful in gaining admission to different colleges and universities.

Most of the MCQs on this page cover Estimation, Testing of Hypotheses, and Parametric and Non-Parametric tests.

**Question 1:** If $Var(\hat{\theta})\rightarrow 0$ as $n \rightarrow \infty$, then $\hat{\theta}$ is said to be

A) Unbiased

B) Sufficient

C) Efficient

D) Consistent
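The idea behind Question 1 can be checked numerically: a sketch (with a hypothetical Uniform(0, 1) setup) showing that the variance of the sample mean shrinks toward 0 as the sample size grows, which is the condition in the question.

```python
import random

# Hypothetical illustration: the sample mean of Uniform(0, 1) draws has
# variance (1/12)/n, which tends to 0 as n grows -- the condition for
# (mean-squared) consistency in Question 1.
random.seed(42)

def sample_mean_variance(n, reps=2000):
    """Empirical variance of the sample mean over many replications."""
    means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]
    grand = sum(means) / reps
    return sum((m - grand) ** 2 for m in means) / reps

v_small = sample_mean_variance(10)    # roughly (1/12)/10
v_large = sample_mean_variance(1000)  # far smaller
print(v_small, v_large)
```

With 2,000 replications the empirical variance at n = 1000 comes out far below the value at n = 10, matching the theoretical rate (1/12)/n.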

**Question 2:** If $E(\hat{\theta})=\theta$, then $\hat{\theta}$ is said to be

A) Unbiased

B) Sufficient

C) Efficient

D) Consistent

**Question 3:** If $T_1$ and $T_2$ are two estimators of $\theta$ with $Var(T_2) < Var(T_1)$, then relative to $T_1$, $T_2$ is

A) Unbiased

B) Efficient

C) Sufficient

D) Consistent

**Question 4:** If $f(x_1,x_2,\cdots,x_n;\theta)=g(\hat{\theta};\theta)h(x_1,x_2,\cdots,x_n)$, then $\hat{\theta}$ is

A) Unbiased

B) Efficient

C) Sufficient

D) Consistent

**Question 5:** Which of the following assumptions are required to show the consistency, unbiasedness and efficiency of the OLS estimator?

i) $E(\mu_t)=0$

ii) $Var(\mu_t)=\sigma^2$

iii) $Cov(\mu_t,\mu_{t-j})=0$ for all $j \neq 0$

iv) $\mu_t \sim N(0,\sigma^2)$

A) (ii) and (iv) only

B) (i) and (iii) only

C) (i), (ii) and (iii) only

D) (i), (ii), (iii) and (iv) only

**Question 6:** For a biased estimator $\hat{\theta}$ of $\theta$, which one is correct?

A) $MSE(\hat{\theta})=SD(\hat{\theta}) + Bias$

B) $MSE(\hat{\theta})=Var(\hat{\theta}) + Bias^2$

C) $MSE(\hat{\theta})=Var(\hat{\theta}) + Bias$

D) $MSE(\hat{\theta})=SD(\hat{\theta}) + Bias^2$
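The decomposition in Question 6 can be verified numerically. A minimal sketch, assuming a deliberately biased estimator (the sample mean of Uniform(0, 1) draws plus a constant offset, a hypothetical choice):

```python
import random

# Numeric check of MSE(theta_hat) = Var(theta_hat) + Bias^2, using a
# hypothetical biased estimator: sample mean plus a constant offset.
random.seed(1)
n, reps, bias_offset = 20, 20000, 0.3
theta = 0.5  # true mean of Uniform(0, 1)

estimates = [sum(random.random() for _ in range(n)) / n + bias_offset
             for _ in range(reps)]
mean_est = sum(estimates) / reps
var_hat = sum((e - mean_est) ** 2 for e in estimates) / reps
bias_hat = mean_est - theta
mse_hat = sum((e - theta) ** 2 for e in estimates) / reps

print(mse_hat, var_hat + bias_hat ** 2)  # the two quantities agree
```

With these sample definitions the identity holds exactly, since the cross term in the expansion of $(e - \theta)^2$ sums to zero.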

**Question 7:** A test is said to be the most powerful test of size $\alpha$, if

A) Among all other tests of size $\alpha$ or greater, it has the largest $\beta$

B) Among all other tests of size $\alpha$ or less, it has the largest power

C) Among all other tests of size $\alpha$ or greater, it has the largest $1-\beta$

D) Among all other tests of size $\alpha$ or greater, it has the smallest power

**Question 8:** In statistical inference, the best asymptotically normal estimator is denoted by

A) BAN

B) CANE

C) BANE

D) A) and B)

E) None of these

**Question 9:** If the conditional distribution of $X_1, X_2,\cdots,X_n$ given $S=s$ does not depend on $\theta$, for any value of $S=s$, the statistic $S=s(X_1,X_2,\cdots,X_n)$ is called

A) Unbiased

B) Consistent

C) Sufficient

D) Efficient

**Question 10:** A set of jointly sufficient statistics is defined to be minimal sufficient if and only if

A) It is a function of every other set of sufficient statistics

B) It is not a function of every other set of sufficient statistics

C) It is a function of some other set of sufficient statistics

D) It is a function of any sufficient statistics in the set

**Question 11:** If $f(X_1, X_2,\cdots, X_n;\theta)$ is the joint density of $n$ random variables $X_1, X_2,\cdots, X_n$, considered as a function of $\theta$, then $L(\theta; X_1,X_2,\cdots, X_n)$ is called

A) Maximum Likelihood function

B) Likelihood Function

C) Log Function

D) Marginal Function
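As a concrete illustration of Question 11 (a hypothetical Bernoulli sample, not from the original text), the likelihood is the joint density evaluated at the observed data and viewed as a function of the parameter; maximizing it over a grid recovers the sample proportion.

```python
import math

# Hypothetical illustration: for a Bernoulli(p) sample, the likelihood
# L(p; x1..xn) = prod of p^x * (1-p)^(1-x) is the joint density viewed
# as a function of p.
data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # 7 successes out of 10

def likelihood(p):
    return math.prod(p if x == 1 else (1 - p) for x in data)

# The likelihood p^7 * (1-p)^3 is maximized at the sample proportion 0.7.
best = max((p / 100 for p in range(1, 100)), key=likelihood)
print(best)  # 0.7
```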

**Question 12:** For two estimators $T_1=t_1(X_1,X_2,\cdots,X_n)$ and $T_2=t_2(X_1,X_2,\cdots,X_n)$, the estimator $T_1$ is defined to be a(n) ______ if $R_{t_1}(\theta)\leq R_{t_2}(\theta)$ for all $\theta$ in $\Theta$

A) Admissible Estimator

B) Sufficient Estimator

C) Consistent Estimator

D) Minimax Estimator

**Question 13:** Let $X_1,X_2,\cdots,X_n$ be a random sample from the density $f(x;\theta)$, where $\theta$ may be a vector. If the conditional distribution of $X_1,X_2,\cdots,X_n$ given $S=s$ does not depend on $\theta$ for any value $s$ of $S$, then the statistic $S$ is called

A) Minimax Statistics

B) Efficient

C) Sufficient Statistic

D) Minimal Sufficient Statistic

**Question 14:** $Var_\theta (T) \geq \frac{[\tau'(\theta)]^2}{nE\left[\left(\frac{\partial}{\partial \theta}\log f(X;\theta)\right)^2\right]}$, where $T=t(X_1,X_2,\cdots, X_n)$ is an unbiased estimator of $\tau(\theta)$. The above inequality is called

A) Cauchy-Schwarz Inequality

B) Boole’s Inequality

C) Chebyshev’s Inequality

D) Cramér-Rao Inequality
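The bound in Question 14 can be checked by simulation. A sketch under an assumed $N(\mu, \sigma^2)$ model with $\sigma$ known, where the bound for unbiased estimators of $\mu$ reduces to $\sigma^2/n$ and is attained by the sample mean:

```python
import random

# Hypothetical check: for X ~ N(mu, sigma^2) with sigma known, the
# Cramér-Rao lower bound for unbiased estimators of mu is sigma^2 / n,
# and the sample mean attains it.
random.seed(7)
mu, sigma, n, reps = 5.0, 2.0, 25, 4000

means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
         for _ in range(reps)]
grand = sum(means) / reps
var_mean = sum((m - grand) ** 2 for m in means) / reps
crlb = sigma ** 2 / n  # = 0.16

print(var_mean, crlb)  # empirical variance is close to the bound
```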

**Question 15:** Let $X_1,X_2,\cdots,X_n$ be a random sample from a density $f(x|\theta)$, where $\theta$ is a value of the random variable $\Theta$ with known density $g_\Theta(\theta)$. Then the estimator of $\tau(\theta)$ with respect to the prior $g_\Theta(\theta)$, defined as $E[\tau(\Theta)|X_1,X_2,\cdots,X_n]$, is called

A) Minimax Estimator

B) Posterior Bayes Estimator

C) Bayes Estimator

D) Sufficient Estimator

**Question 16:** Let $L(\theta;X_1,X_2,\cdots,X_n)$ be the likelihood function for a sample $X_1,X_2,\cdots, X_n$ having joint density $f(x_1,x_2,\cdots,x_n;\theta)$, where $\theta$ belongs to the parameter space $\Theta$. Then a test defined as $\lambda=\lambda_n=\lambda(x_1,x_2,\cdots,x_n)=\frac{\sup_{\theta\in \Theta_0}L(\theta;x_1,x_2,\cdots,x_n)}{\sup_{\theta\in \Theta}L(\theta;x_1,x_2,\cdots,x_n)}$ is called

A) Generalized Likelihood Ratio test

B) Uniformly Most Powerful Test

C) Monotone Likelihood Ratio Test

D) Unbiased Test
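The ratio in Question 16 can be computed directly in a simple assumed case: testing $H_0: \mu = \mu_0$ against all $\mu$ for $X \sim N(\mu, 1)$, where the denominator's supremum is attained at the MLE (the sample mean), so $\lambda = \exp(-n(\bar{x}-\mu_0)^2/2)$. This is a hypothetical illustration, not part of the original question.

```python
import math
import random

# Hypothetical sketch of the generalized likelihood ratio for H0: mu = mu0
# vs all mu, with X ~ N(mu, 1). The unrestricted supremum is attained at
# the sample mean, so lambda = exp(-n * (xbar - mu0)^2 / 2); small values
# of lambda favour rejecting H0.
random.seed(3)
mu0, n = 0.0, 30
x = [random.gauss(0.5, 1.0) for _ in range(n)]  # data drawn away from H0
xbar = sum(x) / n

def log_lik(mu):
    return sum(-0.5 * math.log(2 * math.pi) - 0.5 * (xi - mu) ** 2
               for xi in x)

lam = math.exp(log_lik(mu0) - log_lik(xbar))  # sup over Theta_0 / sup over Theta
print(lam, math.exp(-n * (xbar - mu0) ** 2 / 2))  # the two expressions agree
```

By construction $0 < \lambda \le 1$, since the denominator maximizes over the full parameter space.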

**Question 17:** Let $Z_1,Z_2,\cdots,Z_n$ be independently and identically distributed random variables, satisfying $E[|Z_i|]<\infty$. Let $N$ be an integer-valued random variable whose value $n$ depends only on the values of the first $n$ $Z_i$’s. Suppose $E(N)<\infty$; then $E(Z_1+Z_2+\cdots+Z_N)=E(N)E(Z_1)$ is called

A) Independence Equation

B) Neyman Pearson Lemma

C) Sequential Probability Likelihood Equation

D) Wald’s Equation
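The identity in Question 17 can be illustrated by simulation (a hypothetical setup): draw i.i.d. Uniform(0, 1) variables and stop at the first $N$ for which the running sum exceeds 3, a legitimate stopping time since $N$ depends only on $Z_1,\dots,Z_N$.

```python
import random

# Hypothetical simulation of Wald's equation: Z_i ~ Uniform(0, 1), and N is
# the first index at which the running sum exceeds 3. Wald's equation says
# E(Z_1 + ... + Z_N) = E(N) * E(Z_1), with E(Z_1) = 0.5 here.
random.seed(11)

def one_run():
    total, n = 0.0, 0
    while total <= 3.0:
        total += random.random()
        n += 1
    return total, n

reps = 20000
results = [one_run() for _ in range(reps)]
avg_sum = sum(t for t, _ in results) / reps
avg_n = sum(n for _, n in results) / reps

print(avg_sum, avg_n * 0.5)  # the two sides of Wald's equation agree
```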

**Download Hypothesis Testing MCQs pdf file with answers**

**Hypothesis Testing MCQS 250.10 KB**

**Online Hypothesis Testing MCQs test**
