MCQs Probability Quiz Online 11

The post is about the Probability Quiz Online. There are 20 multiple-choice questions covering topics related to events and types of events, laws of probability, dependent and independent events, sample space, and probabilities related to coins, dice, and a standard deck of cards. Let us start with the Probability Quiz Online.

Online Multiple Choice Questions about Probability

1. The probability of a sample space is equal to
2. The probability of the occurrence of the event $A$ is $P(A)=$
3. For two mutually exclusive events $A$ and $B$, $P(A) = 0.3$ and $P(B)=0.5$, then $P(A \cup B)$ is
4. If $A$ and $B$ are independent events, then $P(A\cap B)=$
5. Bayes' Theorem is
6. When an event is certain to occur, its probability is
7. The term "Sample Space" is used for
8. If $A$ and $B$ are not mutually exclusive events, then $P(A\cup B)=$
9. When the occurrence of an event affects the probability of the occurrence of another event, it is called
10. A fair die is rolled twice; the probability of getting a sum of 8 is
11. If three coins are tossed, all the possible cases are
12. The probability of an event $A$ lies between
13. If $A \cap B = \phi$, then $P(A \cup B)=$
14. If $P(B|A)=0.30$ and $P(A \cap B)=0.12$, then $P(A)$ is
15. The probability of drawing a diamond card from a pack of 52 cards is
16. The probability of drawing a picture card from a pack of 52 cards is
17. If $A$ and $B$ are mutually exclusive events, then $P(A \cup B)=$
18. If a fair die is rolled, the sample space is
19. If $A$ and $B$ are dependent events, then $P(A \cap B) =$
20. The probability of drawing a ball at random from the box is


https://gmstat.com, https://rfaqs.com

A Quick Overview of Probability

This post gives a quick overview of probability. Probability theory is a branch of mathematics that deals with the occurrence of random events. It provides a framework for quantifying uncertainty and making predictions based on available information.

Overview of Probability

Probability can be classified from two points of view: objective and subjective.

Objective probability requires a computational formula, while subjective probability can be derived from experience, judgment, or previous knowledge about the event. In this post, I will focus on an overview of probability and the Laws of Probability.

Objective Probability

Objective probability has the following definitions:

Classical and a Priori Definition

$$P(A) = \frac{\text{Number of sample points favorable to the event } A}{\text{Number of sample points in the sample space}} = \frac{m}{n} = \frac{n(A)}{n(S)}$$

The Relative Frequency or a Posteriori Definition

$$P(A) = \lim\limits_{n\rightarrow \infty} \frac{m}{n}$$

where $m$ is the number of times the event $A$ occurs in $n$ repetitions of the experiment. This definition assumes that as $n$ increases to infinity, $\frac{m}{n}$ becomes stable.

When an experiment is repeated under the same conditions many times, the relative frequency of the favorable event becomes stable. For example, suppose we toss a coin 10 times, then 100 times, then 1,000 times, then 10,000 times, then 100,000 times, then 1,000,000 times, and so on, recording the number of heads each time. Let $H$ (the occurrence of a head) be our favorable event; the probability of a favorable event is also called the probability of success. The definition then says that the proportion of heads in one million tosses will be approximately 50%. This definition is also called the empirical or statistical definition of probability, and it is more useful in practical problems; for example, it is how the winning percentage of a team is found.
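
The stabilization of the relative frequency can be illustrated with a short simulation. Below is a minimal sketch, assuming Python with NumPy is available; the seed and batch sizes are arbitrary choices for the illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Relative frequency m/n of heads for increasingly long runs of tosses
for n in [10, 100, 1_000, 10_000, 100_000, 1_000_000]:
    tosses = rng.integers(0, 2, size=n)  # 1 = head, 0 = tail
    m = tosses.sum()                     # m = number of heads in n tosses
    print(f"n = {n:>9}: m/n = {m / n:.4f}")
```

As $n$ grows, the printed ratios settle close to $0.5$, the classical probability of a head for a fair coin.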

The axiomatic Definition of Probability

An axiom is a statement that is accepted as true without proof and serves as a starting point for further reasoning.

The axiomatic definition of probability states that for a sample space $S$ with sample points $E_1, E_2, \cdots, E_n$, a real number $P(E_i)$ is assigned to each sample point such that the following conditions are satisfied (an illustration follows the list):

  • for any event $E_i$, $0 \le P(E_i) \le 1$
  • $P(S) = 1$, sure event
  • If $A$ and $B$ are two mutually exclusive events, then $P(A\cup B) = P(A) + P(B)$
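
As an illustration, for a single roll of a fair die the assignment $P(E_i) = \frac{1}{6}$ for $i = 1, 2, \cdots, 6$ satisfies all three conditions:

$$0 \le P(E_i) \le 1, \qquad P(S) = \sum_{i=1}^{6} \frac{1}{6} = 1, \qquad P(\{1\} \cup \{2\}) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$$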

Laws of Probability

For computing the probability of two or more events, the following laws of probability may be used.

Law of Addition

  • For mutually exclusive events: $P(A\cup B) = P(A) + P(B)$
  • For non-mutually exclusive events: $P(A\cup B) = P(A) + P(B) - P(A\cap B)$

If $A$, $B$, and $C$ are three events in a sample space $S$, then

$P(A\cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(B\cap C) - P(A \cap C) + P(A \cap B \cap C)$
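
As a quick check of the two-event form, consider a single roll of a fair die with $A$ the event of an even number, $A=\{2,4,6\}$, and $B$ the event of a number greater than 3, $B=\{4,5,6\}$:

$$P(A \cup B) = P(A) + P(B) - P(A \cap B) = \frac{3}{6} + \frac{3}{6} - \frac{2}{6} = \frac{4}{6}$$

which agrees with counting the four outcomes in $A \cup B = \{2, 4, 5, 6\}$ directly.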

Law of Multiplication

For independent events $A$ and $B$: $P(A \text{ and } B) = P(A) \times P(B)$

For dependent events $A$ and $B$: $P(A \text{ and } B) = P(A) \times P(B|A)$ (where $P(B|A)$ is the conditional probability of $B$ given $A$)
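
Here is a minimal sketch in plain Python (standard library only) applying both forms of the multiplication law, using two coin tosses for the independent case and two draws without replacement from a deck of 52 cards for the dependent case:

```python
from fractions import Fraction

# Independent events: two tosses of a fair coin both land heads
p_head = Fraction(1, 2)
p_two_heads = p_head * p_head                 # P(A and B) = P(A) * P(B)

# Dependent events: two cards drawn without replacement are both aces
p_first_ace = Fraction(4, 52)                 # P(A)
p_second_ace_given_first = Fraction(3, 51)    # P(B|A), one ace already gone
p_two_aces = p_first_ace * p_second_ace_given_first  # P(A) * P(B|A)

print(p_two_heads)  # 1/4
print(p_two_aces)   # 1/221
```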

Law of Complementation

If $A$ is an event and $A’$ is the complement of that event, then

$P(A’) = 1-P(A)$, Note that $P(A) + P(A’) = 1$

Probability of sub-event

If $A$ and $B$ are two events in such a way that $A \subset B$, then $P(A) \le P(B)$

If $A$ and $B$ are any two events defined in a sample space $S$, then

$P(A\cap B’) = P(A) – P(A\cap B)$
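
For example, when one card is drawn from a standard deck of 52 cards, let $A$ be the event that the card is a heart and $B$ the event that it is a king; then the probability of drawing a heart that is not a king is

$$P(A \cap B') = P(A) - P(A \cap B) = \frac{13}{52} - \frac{1}{52} = \frac{12}{52} = \frac{3}{13}$$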

Conditional Probability

$P(A|B) = \frac{P(A\cap B)}{P(B)}$ or $P(B|A) = \frac{P(A\cap B)}{P(A)}$.

Example of Conditional Probability

If we throw a die, what is the probability of getting a 6? It is $\frac{1}{6}$. What is the probability of getting a 6 given that the outcome is an even number?

When a die is rolled, the sample space is $S=\{1, 2, 3, 4, 5, 6\}$. Let $A=\{6\}$ denote the event of a 6, and let $B=\{2, 4, 6\}$ denote the event of an even number. Then

$$P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{1/6}{3/6} = \frac{1}{3}$$
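
The same answer can be obtained by counting outcomes directly; here is a minimal sketch in plain Python, where the names $S$, $A$, and $B$ match the events above:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space of one roll of a die
A = {6}                  # event: a six is rolled
B = {2, 4, 6}            # event: an even number is rolled

# With equally likely outcomes, P(A|B) = P(A ∩ B) / P(B) reduces to
# counting outcomes in A ∩ B relative to outcomes in B.
p_A_given_B = Fraction(len(A & B), len(B))
print(p_A_given_B)       # 1/3
```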

Law of Total Probability

If events $B_1, B_2, \cdots, B_n$ are mutually exclusive and exhaustive events, then for any event $A$: $P(A) = P(A|B_1) \times P(B_1) + P(A|B_2) \times P(B_2) + \cdots + P(A|B_n) \times P(B_n)$

Bayes' Theorem

Bayes' theorem is used to update probabilities based on new information.

If $A_1, A_2, \cdots, A_k$ are mutually exclusive and exhaustive events in a sample space $S$, then for any event $B$ with $P(B) > 0$,

$$P(A_i|B) = \frac{P(A_i) P(B|A_i)}{\sum_{j=1}^{k} P(A_j) P(B|A_j)}, \text{ for } i = 1, 2, 3, \cdots, k$$
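
Here is a minimal sketch in plain Python with hypothetical numbers: machine $A_1$ produces 60% of the items with a 2% defective rate, machine $A_2$ produces 40% with a 5% defective rate, and we want the probability that a defective item (event $B$) came from each machine:

```python
# Prior probabilities P(A_i) of the mutually exclusive and exhaustive sources
prior = {"A1": 0.60, "A2": 0.40}
# Conditional probabilities of a defective item given each source, P(B|A_i)
p_defect_given = {"A1": 0.02, "A2": 0.05}

# Law of total probability: P(B) = sum over i of P(A_i) * P(B|A_i)
p_defect = sum(prior[a] * p_defect_given[a] for a in prior)

# Bayes' theorem: P(A_i|B) = P(A_i) * P(B|A_i) / P(B)
posterior = {a: prior[a] * p_defect_given[a] / p_defect for a in prior}
print(posterior)  # approximately {'A1': 0.375, 'A2': 0.625}
```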


https://rfaqs.com, https://gmstat.com

Probability Distribution Discrete Random Variable

A probability distribution for a discrete random variable $X$ is a list of each possible value of $X$ together with the probability that $X$ takes that value when the experiment is run. The probability that $X$ takes the value $x$ is denoted by $P(X=x)$. The probability distribution of a discrete random variable is also called a discrete probability distribution.

A discrete probability distribution is a mathematical function that assigns probabilities to each possible value of a discrete random variable.

Example of Probability Distribution of a Discrete Random Variable

Let $X$ be a random variable representing the number of heads obtained when a coin is flipped three times in an experiment. The sample space of the experiment is:

$$HHH, HHT, HTH, THH, HTT, TTH, THT, TTT$$

where $T$ represents the occurrence of Tail and $H$ represents the occurrence of Head in the above experiment.

Then $X$ has 4 possible values, $0, 1, 2, 3$, for the number of heads obtained. The probability distribution for $X$ is given below:

| $X$ | $P(X)$ |
|-------|----------------|
| 0 | $\frac{1}{8}$ |
| 1 | $\frac{3}{8}$ |
| 2 | $\frac{3}{8}$ |
| 3 | $\frac{1}{8}$ |
| Total | $1.0$ |
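
This table can be reproduced by enumeration; below is a minimal sketch in plain Python (standard library only):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# All 2^3 = 8 equally likely outcomes of flipping a coin three times
outcomes = list(product("HT", repeat=3))

# X = number of heads in each outcome
counts = Counter(outcome.count("H") for outcome in outcomes)

# P(X = x) = (number of favorable outcomes) / (total number of outcomes)
distribution = {x: Fraction(counts[x], len(outcomes)) for x in sorted(counts)}
print(distribution)  # {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
```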

In a statistics class, 25 students are given a 5-point quiz. Three students scored 0, 1 student scored 1, 4 students scored 2, 8 students scored 3, 6 students scored 4, and 3 students scored 5. If a student is chosen at random and the random variable $S$ is the student's quiz score, then the discrete probability distribution of $S$ is

| $S$ | $P(S)$ |
|-------|--------|
| 0 | 0.12 |
| 1 | 0.04 |
| 2 | 0.16 |
| 3 | 0.32 |
| 4 | 0.24 |
| 5 | 0.12 |
| Total | 1.0 |

Note that for any discrete random variable $X$, $0\le P(X) \le 1$ and $\Sigma P(X) =1$.

Finding Probabilities from a Discrete Probability Distribution

Since a random variable can only take one value at a time, the events of a variable assuming two different values are always mutually exclusive. The probability of the variable taking on any number of different values can thus be found by simply adding the appropriate probabilities.
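
For example, using the coin-flip distribution above, the probability of obtaining at least two heads is

$$P(X \ge 2) = P(X=2) + P(X=3) = \frac{3}{8} + \frac{1}{8} = \frac{1}{2}$$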


Mean or Expected Value of a Discrete Random Variable

The mean or expected value of a random variable $X$ is the average value that one should expect for $X$ over many trials of the experiment in the long run. The general notation of the mean or expected value of a random variable $X$ is represented as

$$\mu_x\quad \text{ or } E[X]$$

The mean of a discrete random variable is computed using the formula

$$E[X]=\mu_x = \Sigma x\cdot P(X)$$

Example 1

From the above experiment with three coins, the expected value of the random variable $X$ is

| $X$ | $P(X)$ | $x \cdot P(X)$ |
|-------|---------------|--------------------------------------|
| 0 | $\frac{1}{8}$ | $0 \times \frac{1}{8} = 0$ |
| 1 | $\frac{3}{8}$ | $1 \times \frac{3}{8} = \frac{3}{8}$ |
| 2 | $\frac{3}{8}$ | $2 \times \frac{3}{8} = \frac{6}{8}$ |
| 3 | $\frac{1}{8}$ | $3 \times \frac{1}{8} = \frac{3}{8}$ |
| Total | $1.0$ | $\frac{3}{2} = 1.5$ |

Thus, if three coins are flipped a large number of times, one should expect the average number of heads (per 3 flips) to be about 1.5.


Example 2

Similarly, the mean of the random variable $S$ from the above example is

| $S$ | $P(S)$ | $S \cdot P(S)$ |
|-------|--------|---------------------------|
| 0 | 0.12 | $0 \times 0.12 = 0$ |
| 1 | 0.04 | $1 \times 0.04 = 0.04$ |
| 2 | 0.16 | $2 \times 0.16 = 0.32$ |
| 3 | 0.32 | $3 \times 0.32 = 0.96$ |
| 4 | 0.24 | $4 \times 0.24 = 0.96$ |
| 5 | 0.12 | $5 \times 0.12 = 0.60$ |
| Total | $1.0$ | $2.88$ |

Note that $2.88$ is the class average on the statistics quiz as well.

Variance and Standard Deviation of a Random Variable

One may be interested in how much the values of a random variable differ from trial to trial. To measure this, one can define the variance and standard deviation for a random variable $X$. The variance of the random variable $X$ is denoted by $\sigma^2_x$, while the standard deviation of $X$ is just the square root of $\sigma^2_x$. The formulas for the variance and standard deviation of a random variable $X$ are:

\begin{align*}
\sigma^2_x &= \Sigma (x - \mu)^2 P(X)\\
\sigma_x &= \sqrt{\Sigma (x - \mu)^2 P(X)}
\end{align*}

Note that the standard deviation estimates the average difference between a value of $x$ and the expected value.

Calculating the Variance and Standard Deviation

The calculation of standard deviation for a random variable is similar to the calculation of weighted standard deviation in a frequency table. The $P(x)$ can be thought of as the relative frequency of $x$. The computation of variance and standard deviation of a random variable $X$ can be made using the following steps:

  1. Compute $\mu_X$ (mean of the random variable)
  2. Subtract the mean/average from each of the possible values of $X$. These values are called the deviations of the $X$ values.
  3. Square each of the deviations calculated in the previous step.
  4. Multiply each squared deviation (calculated in step 3) by the corresponding probability $P(x)$.
  5. Sum the results of step 4; this sum is the variance $\sigma^2_X$ of the random variable.
  6. Take the square root of $\sigma^2_X$ computed in step 5 to obtain the standard deviation $\sigma_X$ (see the sketch after this list).
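
Here is a minimal sketch in plain Python that carries out these steps for the quiz-score distribution $S$ from the example above:

```python
# Quiz-score distribution S from the example above
values = [0, 1, 2, 3, 4, 5]
probs = [0.12, 0.04, 0.16, 0.32, 0.24, 0.12]

# Step 1: mean (expected value), mu = sum of x * P(x)
mu = sum(x * p for x, p in zip(values, probs))

# Steps 2-5: variance = sum of (x - mu)^2 * P(x)
variance = sum((x - mu) ** 2 * p for x, p in zip(values, probs))

# Step 6: the standard deviation is the square root of the variance
std_dev = variance ** 0.5

print(round(mu, 2), round(variance, 4), round(std_dev, 4))  # 2.88 2.1056 1.4511
```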

Importance of Discrete Probability Distributions

  • Modeling Real-World Phenomena: Discrete distributions help us understand and model random events in various fields such as engineering, finance, and the sciences.
  • Decision Making: These distributions provide a framework for making informed decisions under uncertainty.
  • Statistical Inference: These are used to make inferences about populations based on sample data.

FAQs about the Probability Distribution of a Discrete Random Variable

  1. Define the probability distribution.
  2. What is a random variable?
  3. What is meant by the expected value of a random variable?
  4. What is meant by the variance and standard deviation of a random variable?

https://rfaqs.com, https://gmstat.com