Free Online Probability Distributions Quiz

This post is about the Online Probability Distributions Quiz and covers topics related to the mean and variance of random variables and the distributions of random variables. The MCQs Probability and Random Variable quizzes require knowledge of events, experiments, mutually exclusive events, collectively exhaustive events, sure events, impossible events, the addition and multiplication laws of probability, concepts related to discrete and continuous random variables, probability distribution and probability density functions, the characteristics and properties of probability distributions, discrete probability distributions, continuous probability distributions, etc. To start the Online Probability Distributions Quiz, click the links below.

Online Probability Distributions Quiz

  • Probability Distribution Quiz 08
  • MCQs Probability Distributions 07
  • MCQs Probability Distributions 06
  • MCQs Probability Distributions 05
  • MCQs Probability Distributions 04
  • MCQs Probability Distributions 03
  • MCQs Probability Distributions 02
  • MCQs Probability Distributions 01

Introduction

Probability distributions are the foundation for understanding how likely different outcomes are in random events. A probability distribution describes the various possible values a random variable can take on and the probability associated with each of those values.

There are two main categories of probability distributions: discrete probability distributions and continuous probability distributions.

Uses of Probability Distributions

Probability distributions are widely used in various fields, including:

  • Statistics: Form the foundation for statistical analysis and inference.
  • Finance: Used to model stock prices, investment returns, and risk analysis.
  • Machine Learning: Play a crucial role in algorithms for classification, prediction, and anomaly detection.
  • Engineering: Applied in reliability analysis, quality control, and signal processing.
  • Many other scientific disciplines: Used to model natural phenomena, analyze experimental data, and assess uncertainties.

Therefore, by understanding the concepts of probability distributions, we can

  • Calculate probabilities of specific events: Given a distribution (discrete or continuous), one can calculate the probability of a certain outcome or a range of outcomes occurring.
  • Make predictions about future events: By analyzing past data and fitting it to a probability distribution, one can make predictions about the likelihood of similar events happening in the future.
  • Compare outcomes from different scenarios: One can compare the probabilities of events associated with different choices or conditions.

By understanding probability distributions, you gain a powerful tool to analyze randomness, quantify uncertainty, and make informed decisions under uncertainty.
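
As a small illustration of the first two points above, the following R sketch computes the probability of a specific outcome and of a range of outcomes; the distributions and parameter values (a binomial with $n=10$ and $p=0.4$, and a standard normal) are chosen only for the example.

# Probability of exactly 3 successes in 10 independent trials with success probability 0.4
dbinom(3, size = 10, prob = 0.4)

# Probability of at most 3 successes (a range of outcomes)
pbinom(3, size = 10, prob = 0.4)

# For a continuous random variable: P(X <= 1.96) when X ~ N(0, 1)
pnorm(1.96, mean = 0, sd = 1)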


Binomial Distribution (2016)

In this post, we will learn about Binomial Distribution and its basics.

A statistical experiment that consists of a fixed number of successive independent trials (say $n$ trials), where each trial has only two possible outcomes (such as success and failure, true and false, yes and no, right and wrong) and the probability of success is the same for every trial, is called a Binomial Experiment. Each trial of a binomial experiment is known as a Bernoulli trial (a single performance of the experiment); tossing a coin a fixed number of times and counting the number of heads is a typical example. A simple simulation of such an experiment is sketched after the list of properties below.

Properties of the Binomial Experiment

  1. Each trial of the Binomial Experiment can be classified as a success or failure.
  2. The probability of success for each trial of the experiment is equal.
  3. Successive trials are independent, that is, the occurrence of one outcome in an experiment does not affect the occurrence of the other.
  4. The experiment is repeated a fixed number of times.
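
As mentioned above, a coin-tossing experiment makes these properties concrete. The following R sketch simulates such a binomial experiment; the 10 trials and success probability of 0.5 are illustrative values only.

set.seed(123)                              # for reproducible results
trials <- rbinom(10, size = 1, prob = 0.5) # ten Bernoulli trials (0 = failure, 1 = success)
trials
sum(trials)                                # number of successes in this binomial experiment
rbinom(5, size = 10, prob = 0.5)           # five repetitions of the whole 10-trial experiment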

Binomial Distribution

Let $X$ be a discrete random variable that denotes the number of successes in a Binomial Experiment (we call this a binomial random variable). The random variable assumes only the isolated values $X=0,1,2,\cdots,n$. The probability distribution of the binomial random variable is termed the binomial distribution. It is a discrete probability distribution.

Binomial Probability Mass Function

The probability function of the binomial distribution is also called the binomial probability mass function. It can be denoted by $b(x, n, p)$, that is, the binomial distribution of a random variable $X$ with $n$ (the given number of trials) and $p$ (the probability of success) as parameters. If $p$ is the probability of success (and $q=1-p$ is the probability of failure, so that $p+q=1$), then the probability of exactly $x$ successes can be found from the following formula:

\begin{align}
b(x, n, p) &= P(X=x)\\
&=\binom{n}{x} p^x q^{n-x}, \quad x=0,1,2, \cdots, n
\end{align}

where $p$ is the probability of success of a single trial, $q$ is the probability of failure and $n$ is the number of independent trials.

For given values of $n$ and $p$, the formula gives the probability for each possible value $x$ of the binomial random variable $X$. Note that it assigns no probability to outcomes outside the range of $X$, that is, $P(X<0)=0$ and $P(X>n)=0$. The binomial distribution is suitable when $n$ is small, and it applies when sampling is done with replacement.

\[b(x, n, p) = \binom{n}{x} p^x q^{n-x}, \quad x=0,1,2,\cdots,n,\]

is called the Binomial distribution because its successive terms are the same as the terms of the binomial expansion of $(q+p)^n$:

\begin{align}
(q+p)^n=\binom{n}{0} p^0 q^{n-0}+\binom{n}{1} p^1 q^{n-1}+\cdots+\binom{n}{n-1} p^{n-1} q^{n-(n-1)}+\binom{n}{n} p^n q^{n-n}
\end{align}

$\binom{n}{0}, \binom{n}{1}, \binom{n}{2},\cdots, \binom{n}{n-1}, \binom{n}{n}$ are called Binomial coefficients.

Note that it is necessary to state the range (limits) of the random variable; otherwise, the formula is only a mathematical equation, not a probability distribution.
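
As a quick numerical check of the probability mass function given above, the following R sketch evaluates the formula directly and compares it with R's built-in dbinom() function; the values $n=10$ and $p=0.3$ are chosen only for illustration.

n <- 10; p <- 0.3; q <- 1 - p
x <- 0:n

# Binomial probabilities from the formula b(x, n, p) = C(n, x) * p^x * q^(n - x)
manual <- choose(n, x) * p^x * q^(n - x)

# R's built-in binomial probability mass function
builtin <- dbinom(x, size = n, prob = p)

all.equal(manual, builtin)   # TRUE: the two calculations agree
sum(builtin)                 # the probabilities sum to 1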


Take Online MCQ tests on Probability Distributions


Generate Binomial Random Numbers in R Language

FAQs about Binomial Distribution

  1. What is a binomial random variable?
  2. What is a binomial experiment?
  3. What is the binomial formula?
  4. What is the binomial probability mass function?
  5. Discuss the properties of the Binomial experiment.
  6. What are the parameters of binomial distribution?

Non Central Chi Square Distribution (2013)

The Non Central Chi Square Distribution is a generalization of the Chi-Square Distribution.
If $Y_{1}, Y_{2}, \cdots, Y_{n}$ are independent standard normal random variables, i.e. $Y_{i} \sim N(0,1)$, then $Y_{i}^{2} \sim \chi^{2}_{(1)}$ and $\sum Y_{i}^{2} \sim \chi^{2}_{(n)}$.

If the means ($\mu_{i}$) are non-zero, then $Y_{i} \sim N(\mu_{i}, 1)$, i.e. each $Y_{i}$ has a different mean, and
\begin{align*}
\Rightarrow  & \qquad Y_i^2 \sim \chi^{2}_{\left(1,\frac{\mu_i^2}{2}\right)} \\
\Rightarrow  & \qquad \sum Y_i^2 \sim \chi^{2}_{\left(n,\frac{\sum \mu_i^2}{2}\right)} =\chi^{2}_{(n,\lambda)}
\end{align*}

Note that if $\lambda = 0$, we have the central $\chi^{2}$ distribution. If $\lambda \ne 0$, it is a noncentral chi-square distribution, because the underlying normal variables are not centered at zero (they are not standard normal).

Central Chi Square Distribution: $f(x)=\frac{1}{2^{\frac{n}{2}} \Gamma\left(\frac{n}{2}\right)} x^{\frac{n}{2}-1} e^{-\frac{x}{2}}; \qquad 0<x<\infty$
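
A small Monte Carlo check of the noncentral chi-square definition above can be done in R. Note that R's chi-square functions (rchisq, dchisq, pchisq) use the noncentrality parameter ncp $= \sum \mu_i^2$, which is twice the $\lambda$ defined in this post, so ncp $= 2\lambda$ below; the means used here are arbitrary illustration values.

set.seed(1)
mu <- c(0.5, 1.0, 1.5)                  # arbitrary non-zero means
n  <- length(mu)                        # degrees of freedom
lambda <- sum(mu^2) / 2                 # noncentrality parameter as defined in this post

# Simulate w = sum of squared N(mu_i, 1) variables, many times
w <- replicate(10000, sum(rnorm(n, mean = mu, sd = 1)^2))

# Empirical mean versus the theoretical mean n + 2*lambda of the noncentral chi-square
c(empirical = mean(w), theoretical = n + 2 * lambda)

# Empirical CDF at 5 versus R's noncentral chi-square CDF (ncp = 2*lambda)
c(empirical = mean(w <= 5), theoretical = pchisq(5, df = n, ncp = 2 * lambda))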

Theorem:

If $Y_{1}, Y_{2}, \cdots, Y_{n}$ are independent normal random variables with $E(Y_{i})=\mu_{i}$ and $V(Y_{i})=1$, then $w=\sum Y_{i}^{2}$ is distributed as noncentral chi-square with $n$ degrees of freedom and noncentrality parameter $\lambda$, where $\lambda=\frac{\sum \mu_{i}^{2}}{2}$, and has pdf

\begin{align*}
f(w)=e^{-\lambda} \sum_{i=0}^{\infty}\left[\frac{\lambda^{i}\, w^{\frac{n+2i}{2}-1}\, e^{-\frac{w}{2}}}{i!\, 2^{\frac{n+2i}{2}}\, \Gamma\left(\frac{n+2i}{2}\right)}\right] \qquad 0\le w<\infty
\end{align*}
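
Before working through the proof, the density stated in the theorem can be checked numerically. The R sketch below evaluates a truncated version of the series and compares it with R's dchisq(), again using ncp $= 2\lambda$; the values $n=4$, $\lambda=1.5$, and $w=3.7$ are illustrative only.

n <- 4; lambda <- 1.5; w <- 3.7        # illustrative values

# Truncate the infinite series after 51 terms (it converges very quickly)
i <- 0:50
series_pdf <- exp(-lambda) * sum(lambda^i * w^((n + 2 * i) / 2 - 1) * exp(-w / 2) /
                                 (factorial(i) * 2^((n + 2 * i) / 2) * gamma((n + 2 * i) / 2)))

# R's built-in noncentral chi-square density (ncp = sum(mu_i^2) = 2 * lambda)
builtin_pdf <- dchisq(w, df = n, ncp = 2 * lambda)

c(series = series_pdf, builtin = builtin_pdf)   # the two values agree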

Proof: Non Central Chi Square Distribution

Consider the moment generating function of $w=\sum y_{i}^{2}  $

\begin{align*}
M_{w}(t)=E(e^{wt})=E(e^{t\sum y_{i}^{2}}); \qquad \text{ where } y_{i} \sim N(\mu_{i}, 1)
\end{align*}

By definition
\begin{align*}
M_{w}(t) &= \int \cdots \int e^{t\sum y_{i}^{2}} \left(\frac{1}{\sqrt{2\pi}}\right)^{n} e^{-\frac{1}{2}\sum (y_{i}-\mu_{i})^{2}}\, dy_{1}\, dy_{2} \cdots dy_{n}\\
&= K_{1} \int \cdots \int e^{-\frac{1}{2}(1-2t)\left[\sum y_{i}^{2} -\frac{2\sum y_{i}\mu_{i}}{1-2t}\right]}\, dy_{1}\, dy_{2} \cdots dy_{n}, \qquad \text{where } K_{1}=\left(\frac{1}{\sqrt{2\pi}}\right)^{n} e^{-\frac{\sum \mu_{i}^{2}}{2}}\\
&\text{By completing the square,}\\
&= K_{1} \int \cdots \int e^{-\frac{1}{2}(1-2t)\sum\left[\left(y_{i}-\frac{\mu_{i}}{1-2t}\right)^{2}-\frac{\mu_{i}^{2}}{(1-2t)^{2}}\right]}\, dy_{1}\, dy_{2} \cdots dy_{n}\\
&= e^{-\frac{\sum \mu_{i}^{2}}{2}\left(1-\frac{1}{1-2t}\right)} \int \cdots \int \left(\frac{1}{\sqrt{2\pi}}\right)^{n} \frac{\left(\sqrt{1-2t}\right)^{n}}{\left(\sqrt{1-2t}\right)^{n}}\, e^{-\frac{1-2t}{2}\sum\left(y_{i}-\frac{\mu_{i}}{1-2t}\right)^{2}}\, dy_{1}\, dy_{2} \cdots dy_{n}\\
&= e^{-\frac{\sum \mu_{i}^{2}}{2}\left(1-\frac{1}{1-2t}\right)}\, \frac{1}{\left(\sqrt{1-2t}\right)^{n}} \int \cdots \int \left(\frac{1}{\sqrt{2\pi}}\right)^{n} \left(\sqrt{1-2t}\right)^{n} e^{-\frac{1-2t}{2}\sum\left(y_{i}-\frac{\mu_{i}}{1-2t}\right)^{2}}\, dy_{1}\, dy_{2} \cdots dy_{n}
\end{align*}

where

\[\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty}\left(\frac{1}{\sqrt{2\pi}}\right)^{n} \left(\sqrt{1-2t}\right)^{n} e^{-\frac{1-2t}{2}\sum\left(y_{i}-\frac{\mu_{i}}{1-2t}\right)^{2}}\, dy_{1}\, dy_{2} \cdots dy_{n}=1,\]

because the integrand is the joint density of $n$ independent $N\left(\frac{\mu_{i}}{1-2t}, \frac{1}{1-2t}\right)$ random variables, that is, it is a complete density.

\begin{align*}
M_{w}(t)&=e^{-\frac{\sum \mu_i^2}{2} \left(1-\frac{1}{1-2t}\right)} .\left(\frac{1}{\sqrt{1-2t} } \right)^{n} \\
&=\left(\frac{1}{\sqrt{1-2t}}\right)^{n} e^{-\lambda \left(1-\frac{1}{1-2t} \right)} \\
&=e^{-\lambda}\, e^{\frac{\lambda}{1-2t}}\, \frac{1}{(1-2t)^{\frac{n}{2}}}\\
&\text{Expanding } e^{\frac{\lambda}{1-2t}} \text{ as a power series about zero,}\\
&=e^{-\lambda} \sum_{i=0}^{\infty}\frac{\lambda^{i}}{i!\,(1-2t)^{i}\,(1-2t)^{n/2}}\\
M_{w=\sum y_{i}^{2}}(t)&=e^{-\lambda} \sum_{i=0}^{\infty}\frac{\lambda^{i}}{i!\,(1-2t)^{\frac{n+2i}{2}}}\tag{A}
\end{align*}

Now, the moment generating function (MGF) of the noncentral chi-square distribution, computed directly from the stated density function, is
\begin{align*}
M_{\omega} (t) & = E(e^{\omega t} )\\
&=\int_{0}^{\infty}e^{\omega t}\, e^{-\lambda} \sum_{i=0}^{\infty}\frac{\lambda^{i}\, \omega^{\frac{n+2i}{2}-1}\, e^{-\frac{\omega}{2}}}{i!\,2^{\frac{n+2i}{2}}\, \Gamma\left(\frac{n+2i}{2}\right)}\, d\omega\\
&=e^{-\lambda} \sum_{i=0}^{\infty}\frac{\lambda^{i}}{i!\,2^{\frac{n+2i}{2}}\, \Gamma\left(\frac{n+2i}{2}\right)} \int_{0}^{\infty}e^{-\frac{\omega}{2}(1-2t)}\, \omega^{\frac{n+2i}{2}-1}\, d\omega
\end{align*}
Let
\begin{align*}
\frac{\omega }{2} (1-2t)&=P\\
\Rightarrow \omega & =\frac{2P}{1-2t} \\
\Rightarrow d\omega &=\frac{2\,dP}{1-2t}
\end{align*}

\begin{align*}
&=e^{-\lambda} \sum\limits_{i=0}^{\infty}\frac{\lambda^{i}}{i!\,2^{\frac{n+2i}{2}}\, \Gamma\left(\frac{n+2i}{2}\right)} \int_{0}^{\infty}e^{-P} \left(\frac{2P}{1-2t}\right)^{\frac{n+2i}{2}-1} \frac{2\,dP}{1-2t}  \\
&=e^{-\lambda} \sum_{i=0}^{\infty}\frac{\lambda^{i}\, 2^{\frac{n+2i}{2}}}{i!\,2^{\frac{n+2i}{2}}\, \Gamma\left(\frac{n+2i}{2}\right)\, (1-2t)^{\frac{n+2i}{2}}} \int_{0}^{\infty}e^{-P} P^{\frac{n+2i}{2}-1}\, dP \\
&=e^{-\lambda} \sum_{i=0}^{\infty}\frac{\lambda^{i}}{i!\,\Gamma\left(\frac{n+2i}{2}\right)\, (1-2t)^{\frac{n+2i}{2}}}\, \Gamma\left(\frac{n+2i}{2}\right)
\end{align*}

as \[\int\limits_{0}^{\infty}e^{-P} P^{\frac{n+2i}{2}-1}\, dP=\Gamma\left(\frac{n+2i}{2}\right)\]

\[M_{\omega } (t)=e^{-\lambda } \sum _{i=0}^{\infty }\frac{\lambda ^{i} }{i!(1-2t)^{\frac{n+2i}{2} } }  \tag{B}\]

Comparing ($A$) and ($B$)
\[M_{w=\sum y_{i}^{2} } (t)=M_{\omega } (t)\]


By the uniqueness theorem of moment generating functions,

\[f_{w} (w)=f_{\omega } (\omega )\]
\begin{align*}
\Rightarrow \qquad f_{w}(w)&=f\left(\chi^{2}_{(n,\lambda)}\right)\\
&=e^{-\lambda} \sum_{i=0}^{\infty}\frac{\lambda^{i}\, w^{\frac{n+2i}{2}-1}\, e^{-\frac{w}{2}}}{i!\,2^{\frac{n+2i}{2}}\, \Gamma\left(\frac{n+2i}{2}\right)};  \qquad 0\le w<\infty
\end{align*}
is the pdf of the noncentral chi-square distribution with $n$ degrees of freedom and noncentrality parameter $\lambda=\frac{\sum \mu_{i}^{2}}{2}$. Like the central chi-square distribution, the noncentral chi-square distribution is additive: if $W_{1}\sim\chi^{2}_{(n_{1},\lambda_{1})}$ and $W_{2}\sim\chi^{2}_{(n_{2},\lambda_{2})}$ are independent, then $W_{1}+W_{2}\sim\chi^{2}_{(n_{1}+n_{2},\,\lambda_{1}+\lambda_{2})}$.
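
The additivity property can be checked by simulation as well. The following R sketch uses arbitrary illustration values, again remembering that R's ncp equals $2\lambda$ in this post's notation.

set.seed(42)
n1 <- 3; n2 <- 5                      # degrees of freedom
ncp1 <- 2; ncp2 <- 4                  # noncentrality parameters in R's parameterization (= 2*lambda)

w1 <- rchisq(10000, df = n1, ncp = ncp1)
w2 <- rchisq(10000, df = n2, ncp = ncp2)

# The sum behaves like a noncentral chi-square with df = n1 + n2 and ncp = ncp1 + ncp2
c(empirical   = mean(w1 + w2 <= 12),
  theoretical = pchisq(12, df = n1 + n2, ncp = ncp1 + ncp2))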

Application of Non Central Chi Square Distribution

  • Power analysis: The Non Central Chi Square Distribution is useful in calculating the power of chi-squared tests, as illustrated in the sketch after this list.
  • Non-normal data: When the underlying data is not normally distributed, the non central chi squared distribution can be used in certain tests that rely on chi-squared approximations.
  • Signal processing: In some areas like radar systems, the non central chi squared distribution arises when modeling signals with background noise.
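
As a concrete example of the power-analysis use, the following R sketch computes the power of a chi-squared test, that is, the probability that the test statistic exceeds the critical value when the noncentral distribution holds; the significance level, degrees of freedom, and noncentrality parameter are illustrative values (ncp again in R's parameterization).

alpha <- 0.05; df <- 4; ncp <- 6      # illustrative values

crit  <- qchisq(1 - alpha, df = df)             # critical value under the central chi-square (H0)
power <- 1 - pchisq(crit, df = df, ncp = ncp)   # P(test statistic > crit) under the noncentral chi-square
power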
