Probability For Class 12
Probability For Class 12 Notes
Probability For Class 12 covers topics like conditional probability, the multiplication rule, random variables, Bayes' theorem, etc. Probability is a measure of how likely an event is to occur. It is calculated as the ratio of the number of favourable outcomes to the total number of outcomes. Note that the probability of an event E always satisfies 0 ≤ P(E) ≤ 1.
For example, when we toss a coin, there are only two possible outcomes: Head or Tail (H, T). But if we toss two coins together, three distinct events can occur: both coins show heads, both show tails, or one shows heads and the other tails, i.e. (H, H), (H, T), (T, T).
Probability Class 12 Concepts
- Introduction
- Conditional Probability
- Multiplication Theorem on Probability
- Independent Events
- Bayes’ Theorem
- Random Variables and its Probability Distributions
- Bernoulli Trials and Binomial Distribution
Probability Formula
The probability formula is defined as the likelihood of an event to happen. It is equal to the ratio of the number of favourable results and the total number of outcomes. The formula for the probability of an event to occur is given by;
P(E) = Number of favourable outcomes/Total number of outcomes
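As a quick sketch, the classical probability formula can be applied by counting outcomes. The die-roll event below is an illustrative example, not from the text:

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die (illustrative example)
sample_space = {1, 2, 3, 4, 5, 6}

# Event E: rolling an even number
event = {2, 4, 6}

# P(E) = number of favourable outcomes / total number of outcomes
p_event = Fraction(len(event), len(sample_space))
print(p_event)  # 1/2
```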
Conditional Probability
Conditional Probability is the likelihood of an event or outcome occurring based on the occurrence of a previous event or outcome. It simply depends on any event in the past which has already taken place.
If E and F are two events with the same sample space of a random experiment, then the conditional probability of the event E given that F has occurred, i.e. P(E|F), is
P(E|F) = P(E ∩ F)/P(F), provided P(F) ≠ 0
Properties of Conditional Probability
Let E and F be events of a sample space S of an experiment, then;
Property 1: P(S|F) = P(F|F) = 1
Property 2: If A and B are two events in a sample space S and F is an event of S, such that P(F)≠0, then;
P((A ∪ B)|F) = P(A|F) + P(B|F) – P((A ∩ B)|F)
Property 3: P(E′|F) = 1 − P(E|F)
Multiplication Rule
Let E and F be two events associated with a sample space S. Clearly, the set E ∩ F denotes the event that both E and F have occurred. In other words, E ∩ F denotes the simultaneous occurrence of the events E and F. The event E ∩ F is also written as EF. According to this rule, if E and F are the events in a sample space, then;
P(E ∩ F) = P(E) P(F|E) = P(F) P(E|F)
where P(E) ≠ 0 and P(F) ≠ 0
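The multiplication rule can be sketched with a standard card-drawing example (the two-ace scenario below is illustrative, not from the text):

```python
from fractions import Fraction

# Drawing two cards without replacement from a 52-card pack (illustrative)
# E: first card is an ace; F: second card is an ace
p_E = Fraction(4, 52)          # P(E)
p_F_given_E = Fraction(3, 51)  # P(F|E): 3 aces remain among 51 cards

# Multiplication rule: P(E ∩ F) = P(E) · P(F|E)
p_both_aces = p_E * p_F_given_E
print(p_both_aces)  # 1/221
```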
Multiplication Theorem on Probability
P(E ∩ F) = P(E) P(F|E) = P(F) P(E|F), provided P(E) ≠ 0 and P(F) ≠ 0.

Multiplication rule of probability for more than two events:

P(E ∩ F ∩ G) = P(E) P(F|E) P(G|(E ∩ F)) = P(E) P(F|E) P(G|EF)

Similarly, the multiplication rule of probability can be extended to four or more events.
Independent Events
Two experiments are said to be independent if for every pair of events E and F, where E is associated with the first experiment and F with the second experiment, the probability of the simultaneous occurrence of the events E and F when the two experiments are performed is the product of P(E) and P(F) calculated separately on the basis of two experiments, i.e.,
P (E ∩ F) = P (E).P(F)
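Independence can be checked directly by comparing P(E ∩ F) with P(E)·P(F). A sketch with two dice (the events picked here are illustrative):

```python
from fractions import Fraction
from itertools import product

# Sample space: one throw of two fair dice (illustrative)
S = list(product(range(1, 7), repeat=2))

E = {s for s in S if s[0] == 6}       # first die shows 6
F = {s for s in S if s[1] % 2 == 0}   # second die shows an even number

p_E = Fraction(len(E), len(S))
p_F = Fraction(len(F), len(S))
p_EF = Fraction(len(E & F), len(S))

# Independence test: P(E ∩ F) = P(E) · P(F)
print(p_EF == p_E * p_F)  # True
```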
Bayes’ Theorem
A set of events E_{1} , E_{2} , …, E_{n} is said to represent a partition of the sample space S if
(a) E_{i} ∩ E_{j} = φ, i ≠ j, i, j = 1, 2, 3, …, n
(b) E_{1} ∪ Ε_{2} ∪ … ∪ E_{n} = S and
(c) P(E_{i} )> 0 for all i = 1, 2, …, n.
In other words, the events E_{1}, E_{2}, …, E_{n} represent a partition of the sample space S if they are pairwise disjoint, exhaustive and have nonzero probabilities. If A is any event with nonzero probability, then:
\(P(E_{i}|A)=\frac{P(E_{i})P(A|E_{i})}{\sum_{j=1}^{n}P(E_{j})P(A|E_{j})}\)
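Bayes' theorem can be sketched with a classic two-bag example (the bags and ball counts below are illustrative, not from the text): Bag I has 3 red and 1 black ball, Bag II has 1 red and 3 black. A bag is chosen at random and a red ball is drawn; what is P(Bag I | red)?

```python
from fractions import Fraction

priors = [Fraction(1, 2), Fraction(1, 2)]       # P(E_1), P(E_2): bag chosen at random
likelihoods = [Fraction(3, 4), Fraction(1, 4)]  # P(A|E_1), P(A|E_2): chance of red

# Denominator: total probability P(A) = Σ P(E_j) P(A|E_j)
p_A = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes' theorem: P(E_1|A) = P(E_1) P(A|E_1) / P(A)
p_E1_given_A = priors[0] * likelihoods[0] / p_A
print(p_E1_given_A)  # 3/4
```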
Theorem of total probability
Let {E_{1}, E_{2}, …, E_{n}} be a partition of a sample space S and suppose that each of E_{1}, E_{2}, …, E_{n} has a nonzero probability. Let A be an event associated with S; then
P(A) = P(E_{1}) P (A|E_{1}) + P (E_{2}) P (A|E_{2}) + … + P (E_{n}) P(A|E_{n})
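The theorem of total probability is just a weighted sum over the partition. A minimal sketch with made-up numbers (the two-machine factory scenario is an assumption for illustration): machine 1 makes 60% of the items with a 2% defect rate, machine 2 makes 40% with a 5% defect rate.

```python
from fractions import Fraction

p_E = [Fraction(60, 100), Fraction(40, 100)]        # partition: P(E_i)
p_A_given_E = [Fraction(2, 100), Fraction(5, 100)]  # P(A|E_i): defect rates

# P(A) = Σ P(E_i) P(A|E_i)
p_defective = sum(pe * pa for pe, pa in zip(p_E, p_A_given_E))
print(p_defective)  # 4/125
```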
Random Variables and its Probability Distributions
A random variable is a real-valued function whose domain is the sample space of a random experiment.
The probability distribution of a random variable X is the system of numbers:
X: x_{1} x_{2} … x_{n}
P(X): p_{1} p_{2} … p_{n}
where p_{i} > 0 for i = 1, 2, …, n, and
\(\sum_{i=1}^{n}p_i=1\)
The real numbers x_{1}, x_{2}, …, x_{n} are the possible values of the random variable X, and p_{i} (i = 1, 2, …, n) is the probability of the random variable X taking the value x_{i}, i.e., P(X = x_{i}) = p_{i}.
The mean of a random variable X is also called the expectation of X, denoted by E(X). Thus,
\(E(X)=\mu=\sum_{i=1}^{n}x_ip_i=x_1p_1+x_2p_2+x_3p_3+…+x_np_n\)
Thus, we can write the mean, variance and standard deviation of a random variable X as:
| Mean | \(E(X)=\mu=\sum_{i=1}^{n}x_ip_i\) |
| Variance | \(\sigma_{x}^{2}=Var(X)=\sum_{i=1}^{n}(x_i-\mu)^2p(x_i)\) or \(\sigma_{x}^{2}=E(X-\mu)^{2}\) |
| Standard deviation | \(\sigma_{x}=\sqrt{Var(X)}=\sqrt{\sum_{i=1}^{n}(x_i-\mu)^2p(x_i)}\) |
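These three formulas can be computed directly from a probability distribution. A sketch using X = number of heads in two fair coin tosses (an illustrative distribution):

```python
from fractions import Fraction
from math import sqrt

# Distribution of X = number of heads in two tosses of a fair coin (illustrative)
x = [0, 1, 2]
p = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

mean = sum(xi * pi for xi, pi in zip(x, p))                    # E(X) = Σ x_i p_i
variance = sum((xi - mean) ** 2 * pi for xi, pi in zip(x, p))  # Σ (x_i - µ)² p_i
std_dev = sqrt(variance)                                       # √Var(X)

print(mean, variance)  # 1 1/2
```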
Bernoulli Trials and Binomial Distribution
Trials of a random experiment are called Bernoulli trials if they satisfy the following conditions:
(i) There should be a finite number of trials.
(ii) The trials should be independent.
(iii) Each trial has exactly two outcomes: success or failure.
(iv) The probability of success remains the same in each trial.
The probability distribution of number of successes in an experiment consisting of n Bernoulli trials may be obtained by the binomial expansion of (q + p)^{n}. Hence, we can write this distribution of the number of successes X as:
X | 0 | 1 | 2 | … | x | … | n |
P(X) | ^{n}C_{0} q^{n} | ^{n}C_{1} q^{n-1} p^{1} | ^{n}C_{2} q^{n-2} p^{2} | … | ^{n}C_{x} q^{n-x} p^{x} | … | ^{n}C_{n} p^{n} |
This probability distribution is known as the binomial distribution with parameters n and p.
From this, the probability of exactly x successes can be calculated as:
P (X = x) = P(x) = ^{n}C_{x} q^{n-x} p^{x} , x = 0, 1, …, n
Here,
n = Total number of trials
p = probability of success
q = 1 – p
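The binomial formula translates directly into code. A minimal sketch (the helper name `binomial_pmf` and the coin example are illustrative):

```python
from fractions import Fraction
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = nCx · q^(n-x) · p^x for x successes in n Bernoulli trials."""
    q = 1 - p
    return comb(n, x) * q ** (n - x) * p ** x

# Example: probability of exactly 2 heads in 4 tosses of a fair coin
print(binomial_pmf(2, 4, Fraction(1, 2)))  # 3/8
```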
Probability Solved Examples
Question: If P (A) = 0.8, P (B) = 0.5 and P (B|A) = 0.4, find:
(i) P (A ∩ B)
(ii) P (A|B)
(iii) P (A ∪ B)
Solution: Given, P (A) = 0.8, P (B) = 0.5 and P (B|A) = 0.4
(i) We have to find P (A ∩ B)
As we know, by conditional probability,
P(B|A) = P (A ∩ B)/P(A)
Therefore,
P (A ∩ B) = P(B|A).P(A) = 0.4 x 0.8 = 0.32
(ii) We have to find P (A|B)
As we know, by conditional probability,
P(A|B) = P (A ∩ B)/P(B)
P(A|B) = 0.32/0.5 = 0.64
(iii) We have to find P (A ∪ B)
P (A ∪ B) = P(A) + P(B) – P (A ∩ B)
= 0.8 + 0.5 – 0.32
= 0.98
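The three results of this worked example can be checked numerically (a quick verification sketch; `round` is used only to absorb floating-point noise):

```python
# Given values from the worked example
p_A, p_B, p_B_given_A = 0.8, 0.5, 0.4

p_A_and_B = p_B_given_A * p_A        # multiplication rule: P(A ∩ B) = P(B|A)·P(A)
p_A_given_B = p_A_and_B / p_B        # conditional probability: P(A|B) = P(A ∩ B)/P(B)
p_A_or_B = p_A + p_B - p_A_and_B     # addition rule: P(A ∪ B)

print(round(p_A_and_B, 2), round(p_A_given_B, 2), round(p_A_or_B, 2))
# 0.32 0.64 0.98
```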
Question 2: If a fair coin is tossed 10 times, find the probability of exactly seven tails.
Solution:
Given that a fair coin is tossed 10 times.
The repeated tosses of a coin are Bernoulli trials.
Let X be the number of tails in an experiment of 10 trials.
Here, X has the binomial distribution with n = 10 and p = ½.
We know that,
P (X = x) = P(x) = ^{n}C_{x} q^{n-x} p^{x} , x = 0, 1, …, n
Here,
x = 7, n = 10, p = ½, q = 1 – (½) = (½)
P(X = 7) = ^{10}C_{7} (1/2)^{10-7} (½)^{7}
= ^{10}C_{3} (½)^{10}
= 120/1024
= 15/128
Therefore, the probability of getting exactly seven tails is 15/128.
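This answer can be verified with the binomial formula (a quick check of the computation above):

```python
from fractions import Fraction
from math import comb

# Question 2: exactly 7 tails in 10 tosses of a fair coin
n, x, p = 10, 7, Fraction(1, 2)
prob = comb(n, x) * (1 - p) ** (n - x) * p ** x
print(prob)  # 15/128
```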
Practice Problems
- It is known that 10% of certain articles manufactured are defective. What is the probability that in a random sample of 12 such articles, 9 are defective?
- Two dice are thrown simultaneously. If X denotes the number of sixes, find the expectation of X.
- A pair of dice is thrown 4 times. If getting a doublet is considered a success, find the probability of two successes.
- A card from a pack of 52 cards is lost. From the remaining cards of the pack, two cards are drawn and are found to be both diamonds. Find the probability of the lost card being a diamond.