Unit-III Probability and Random Variables


Unit-III

Experiment: An experiment is a physical activity which has several possible results, known as outcomes.

Ex: Tossing a coin; measuring the length of a table.

Experiments are of two types: (i) Deterministic Experiment, (ii) Indeterministic Experiment or Random Experiment.

Deterministic Experiment: An experiment whose outcome or result can be predicted with certainty is called a Deterministic Experiment.

Ex: Measuring the length of a table and Measuring the width of the wall.

Indeterministic Experiment: An experiment in which all possible outcomes may be known in advance, but the outcome of a particular performance cannot be predicted, is called an Indeterministic Experiment.

Ex: Tossing two coins and Throwing a die.


Or
Random Experiment: An experiment which satisfies the following conditions:

a) The experiment can be repeated under identical conditions.


b) We have the knowledge of all possible outcomes in advance.
c) Any trial of the experiment results in an outcome that is not known in advance.

Trial: A single performance of an experiment is called a Trial. Each trial results in one or more outcomes.

Ex: Tossing three coins; drawing a card from a pack of cards.

Outcome: The resultant of a random experiment is known as outcome.

Ex: In tossing a coin, head and tail are outcomes.

In throwing a die, the points 1, 2, 3, 4, 5, 6 on the face of the die are outcomes.

Sample Space: The set of all possible outcomes of a random experiment is called the
sample space. It is denoted by S or Ω.

Ex: In tossing two coins random experiment, the sample space

Ω = {HH, HT, TH, TT}


Event: An event is a set of one or more outcomes of a random experiment. An event
is a subset of the sample space S and is denoted by E.

Ex: In tossing two coins, getting at least one head, i.e., {HH, HT, TH}

In throwing two dice, getting the same points on the two faces of the dice may be
an event: E = {(1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6)}
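As a quick check, the two-dice sample space and the "same points" event above can be enumerated directly (a minimal Python sketch; the names `sample_space` and `E` are illustrative):

```python
from itertools import product

# Sample space for throwing two dice: all ordered pairs (i, j)
sample_space = list(product(range(1, 7), repeat=2))

# Event E: same points on both faces of the dice
E = [(i, j) for (i, j) in sample_space if i == j]

print(len(sample_space))  # 36
print(E)  # [(1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6)]
```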

Simple Event: An event which cannot be broken or divided further into smaller
events.

Ex: In tossing a coin, head and tail are simple events.

Compound (Composite) Event: A combination of several simple events. A
compound event can be further divided into smaller events.

Ex: In throwing a die, getting an odd number is a compound event i.e., E= {1, 3, 5}

Exhaustive Events: The set of all possible outcomes of a random experiment is
known as the set of exhaustive events.

Ex: In tossing a coin, head and tail are exhaustive events.

Equally Likely Events: Let S be a finite sample space. Its sample points are
called equally likely if there is no reason to expect any sample point to occur in
preference to the other sample points.

Ex: In throwing an unbiased die, all the six faces are equally likely to come.

Mutually Exclusive (Disjoint) Events: Events are mutually exclusive if the
happening of any one of them precludes the happening of all the others.

Ex: In throwing a die, getting 1, 2, 3, 4, 5, 6 are mutually exclusive events.

Independent Events: Two events are independent if the occurrence of one of them
does not influence the occurrence of the other; otherwise they are dependent.

Favourable Cases: The outcomes of a trial which entail the happening of an event are called the cases favourable to that event.

Mathematical or Classical or A Priori Definition of Probability (Laplace):

If a random experiment or a trial results in ‘n’ exhaustive, mutually exclusive and
equally likely outcomes (cases), out of which ‘m’ are favourable to the occurrence
of an event A, then the probability ‘p’ of occurrence (happening) of A, denoted by
P(A), is

P(A) = (Number of favourable cases) / (Total number of cases) = m/n

Statistical or A Posteriori or Empirical Probability (von Mises): Let a random
experiment be repeated n times and let an event A occur n_A times out of the n trials.
The ratio n_A/n is called the relative frequency of the event A. As n increases, n_A/n
shows a tendency to stabilize and to approach a constant value. This value is
denoted by P(A):

P(A) = lim_{n→∞} (n_A / n)
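This stabilization of the relative frequency can be seen in a simulation (a sketch with Python's standard `random` module; the function name `relative_frequency` is illustrative):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

def relative_frequency(n_trials):
    """Relative frequency n_A / n of the event 'head' in n fair coin tosses."""
    heads = sum(random.random() < 0.5 for _ in range(n_trials))
    return heads / n_trials

# As n increases, the ratio tends toward the constant value P(head) = 0.5
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```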

Axiomatic Definition of Probability: Let S be the sample space and A be an
event associated with a random experiment. Then the probability of the event A,
denoted by P(A), is defined as a real number satisfying the following axioms.
1. 0 ≤ P(A) ≤ 1
2. P(S) = 1
3. If A and B are mutually exclusive events, P(A ∪ B) = P(A) + P(B)

❖ Probability: Probability is a numerical measure of uncertainty.

❖ Properties
a. Probability lies between 0 and 1.
b. Probability is never negative.
c. The total probability of the sample space is 1.
d. For any event A, the probability of the complementary event of A is
P(𝐴̅) = 1 − P(A) => P(𝐴̅) + P(A) = 1
e. If A and B are mutually disjoint events, then P(A ∪ B) = P(A) + P(B), since P(A ∩ B) = 0.
f. For any two events A and B, P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
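These properties can be verified exhaustively on a small sample space. The sketch below uses exact fractions and the two-dice experiment; the events A and B are illustrative choices:

```python
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))  # two-dice sample space, 36 points

def P(event):
    """Classical probability: favourable cases / total cases."""
    return Fraction(len(event), len(S))

A = {s for s in S if s[0] + s[1] == 7}   # event: the sum of the points is 7
B = {s for s in S if s[0] == 1}          # event: the first die shows 1

# Property f (addition law): P(A U B) = P(A) + P(B) - P(A n B)
assert P(A | B) == P(A) + P(B) - P(A & B)
# Property d (complement): P(A') = 1 - P(A)
assert P(S - A) == 1 - P(A)
print(P(A), P(B), P(A | B))  # 1/6 1/6 11/36
```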

• Certain:
If something is certain to happen, then it has a probability of 1.

Ex: It is certain that the sun will rise tomorrow.


Theorems on Probability
1. Probability of the impossible event is zero. i.e., P (∅) = 0
Proof. We know, S U ∅ = S
P (S U ∅) = P (S)
P(S) + P (∅) = P (S)
1 + P (∅) = 1
P (∅) = 0
2. Probability of the complementary event 𝐴̅ is given by P (𝐴̅) = 1 – P (A)
Proof:
We know, A U 𝐴̅ = S
P (A U 𝐴̅) = P(S)
P (A) + P (𝐴̅) = 1
P (𝐴̅) = 1-P(A)
3. For any two events A and B
P (𝐴̅ ∩ 𝐵) = P(B) – P (A∩ B)
Addition Theorem of Probability: If A and B are any two events, then
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Proof:
Express A ∪ B as the union of the two disjoint events A and 𝐴̅ ∩ B, as shown in the
Venn diagram:
A ∪ B = A ∪ (𝐴̅ ∩ B)
Taking probabilities,
P(A ∪ B) = P(A ∪ (𝐴̅ ∩ B))
= P(A) + P(𝐴̅ ∩ B) [∵ additivity axiom]
Add and subtract P(A ∩ B):
∴ P(A ∪ B) = P(A) + P(𝐴̅ ∩ B) + P(A ∩ B) − P(A ∩ B)
= P(A) + P{(𝐴̅ ∩ B) ∪ (A ∩ B)} − P(A ∩ B)
[∵ (𝐴̅ ∩ B) and (A ∩ B) are disjoint events]
= P(A) + P(B) − P(A ∩ B) [∵ (𝐴̅ ∩ B) ∪ (A ∩ B) = B, from the Venn diagram]
∴ P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Conditional Probability: The conditional probability of B given A is
P(B/A) = [n(A ∩ B)/n(S)] / [n(A)/n(S)] = P(A ∩ B)/P(A), provided P(A) > 0.
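Since the n(S) factors cancel, a conditional probability on a finite sample space is just a ratio of counts. A small sketch with illustrative events on the two-dice experiment:

```python
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))  # two-dice sample space

A = {s for s in S if s[0] + s[1] >= 10}  # event: the sum is at least 10
B = {s for s in S if s[0] == 6}          # event: the first die shows 6

# P(B/A) = [n(A n B)/n(S)] / [n(A)/n(S)] = n(A n B)/n(A)
P_B_given_A = Fraction(len(A & B), len(A))
print(P_B_given_A)  # 1/2
```

Here A has six outcomes {(4,6), (5,5), (5,6), (6,4), (6,5), (6,6)}, of which three also lie in B, so P(B/A) = 3/6 = 1/2.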
Bayes’ Theorem: If E1, E2, …, En are n mutually disjoint events with

P(Ei) ≠ 0, i = 1, 2, …, n, then for any arbitrary event A which is a subset of ⋃_{i=1}^{n} Ei

such that P(A) > 0, we have

P(Ei/A) = [P(A/Ei) · P(Ei)] / [∑_{j=1}^{n} P(A/Ej) · P(Ej)], i = 1, 2, …, n

Proof: Since A ⊂ ⋃_{i=1}^{n} Ei, we know that A = A ∩ (⋃_{i=1}^{n} Ei)

A = A ∩ (E1 ∪ E2 ∪ … ∪ En)

A = (A ∩ E1) ∪ (A ∩ E2) ∪ … ∪ (A ∩ En)

Since E1, E2, …, En are mutually disjoint events, (A ∩ E1), (A ∩ E2), …, (A ∩ En) are
also disjoint events.

By the additivity axiom of probability (P(A ∪ B) = P(A) + P(B) for disjoint A and B),

P(A) = P[(A ∩ E1) ∪ (A ∩ E2) ∪ … ∪ (A ∩ En)]

P(A) = P(A ∩ E1) + P(A ∩ E2) + … + P(A ∩ En)

P(A) = ∑_{i=1}^{n} P(A ∩ Ei) = ∑_{i=1}^{n} P(A/Ei) · P(Ei) ……… (***) [from (ii) below]

By conditional probability,

P(A/B) = P(A ∩ B)/P(B);  P(B/A) = P(A ∩ B)/P(A)

P(A ∩ Ei) = n(A ∩ Ei)/n(S) = [n(A ∩ Ei)/n(A)] · [n(A)/n(S)] = P(Ei/A) · P(A) ……… (i)

P(A ∩ Ei) = n(A ∩ Ei)/n(S) = [n(A ∩ Ei)/n(Ei)] · [n(Ei)/n(S)] = P(A/Ei) · P(Ei) ……… (ii)

From (i),
P(Ei/A) = P(A ∩ Ei)/P(A)

∴ P(Ei/A) = [P(A/Ei) · P(Ei)] / [∑_{i=1}^{n} P(A/Ei) · P(Ei)]  [from (***) and (ii)]
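The theorem can be exercised on a hypothetical three-urn example (the priors and likelihoods below are illustrative numbers, not from the notes):

```python
from fractions import Fraction

# Hypothetical setup: three urns E1, E2, E3 chosen with equal probability;
# A = "the drawn ball is white", with P(A/Ei) as given below.
prior = {"E1": Fraction(1, 3), "E2": Fraction(1, 3), "E3": Fraction(1, 3)}
likelihood = {"E1": Fraction(2, 5), "E2": Fraction(1, 5), "E3": Fraction(3, 5)}

# Total probability (the *** step): P(A) = sum_i P(A/Ei) * P(Ei)
P_A = sum(likelihood[e] * prior[e] for e in prior)

# Bayes' theorem: P(Ei/A) = P(A/Ei) * P(Ei) / P(A)
posterior = {e: likelihood[e] * prior[e] / P_A for e in prior}
print(P_A)        # 2/5
print(posterior)  # posteriors sum to 1
```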

Random Variables

Def: Random variable is a real valued function defined on each and every outcome of the sample
space connected with the random experiment.

Ex: Consider the random experiment of two tosses of a coin. Then the sample space is

𝑆 = {𝐻𝐻, 𝐻𝑇, 𝑇𝐻, 𝑇𝑇}

Let 𝑋 be the number of heads. Then 𝑋 maps each outcome as below:

𝑋 (𝐻𝐻) = 2, 𝑋 (𝐻𝑇) = 1, 𝑋 (𝑇𝐻) = 1, 𝑋 (𝑇𝑇) = 0

Then the random variable X takes the values 0, 1, 2.
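In code, the random variable is literally a function on the sample space (a minimal sketch; the name `X` follows the notes):

```python
from itertools import product

# Sample space of two coin tosses: {HH, HT, TH, TT}
S = ["".join(t) for t in product("HT", repeat=2)]

def X(outcome):
    """Random variable: the number of heads in the outcome."""
    return outcome.count("H")

print({s: X(s) for s in S})        # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
print(sorted({X(s) for s in S}))   # values taken by X: [0, 1, 2]
```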

There are two types of random variables: Discrete random variable and Continuous random
variable

Discrete random variable: A random variable that takes at most a countable number of values is
called a discrete random variable.

Ex:

1. Number of heads in tossing three coins.
2. Number of points turned up on the face of a die.
3. Number of birthdays in a city.
Probability Mass Function (p.m.f):

Let X be a discrete random variable which assumes the values x1, x2, …, xn
with the respective probabilities p1, p2, …, pn.

In general, P(X = xi) = pi, or P(xi), is known as the probability mass function; it satisfies the
following properties.

1. P(xi) > 0 for all i
2. ∑_{i=1}^{n} P(xi) = 1
Probability Distribution of a discrete random variable:
The set (𝑥𝑖 , 𝑃(𝑥𝑖 )) is called the Probability Distribution of a discrete random Variable.
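For the random variable X = number of heads in three tosses of a fair coin (example 1 above), the probability distribution can be built and both p.m.f. properties checked exactly:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# X = number of heads in three tosses of a fair coin
outcomes = list(product("HT", repeat=3))            # 8 equally likely outcomes
counts = Counter(t.count("H") for t in outcomes)

# Probability distribution: the set (xi, P(xi))
pmf = {x: Fraction(c, len(outcomes)) for x, c in sorted(counts.items())}
print(pmf)  # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

assert all(p > 0 for p in pmf.values())  # property 1: P(xi) > 0
assert sum(pmf.values()) == 1            # property 2: sum of P(xi) is 1
```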

Continuous Random Variable:

A random variable that takes all possible values between certain limits is called a continuous
random variable.

Ex:

1. Ages of a group of persons.
2. Weights of a group of persons.
3. The temperature recorded in a city.
Probability density function (p.d.f)

The probability density function of a continuous random variable is denoted by f(x) or f_X(x).

The probability density function f(x) satisfies the following properties.

1. f(x) ≥ 0 for all x
2. ∫_{−∞}^{∞} f(x) dx = 1
3. The probability of an event in the interval (a, b) is calculated by
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
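These two integral properties can be checked numerically for a concrete density. The sketch below assumes f(x) = 3x² on [0, 1] purely for illustration, and uses a simple midpoint rule rather than any library integrator:

```python
def f(x):
    """Assumed density for illustration: f(x) = 3x^2 on [0, 1], 0 elsewhere."""
    return 3 * x**2 if 0 <= x <= 1 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Property 2: the total probability is 1
print(round(integrate(f, 0, 1), 6))    # ~1.0
# Property 3: P(0.5 <= X <= 1) = 1 - 0.5^3 = 0.875
print(round(integrate(f, 0.5, 1), 6))  # ~0.875
```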

Distribution Function of a Random Variable

The distribution function of a random variable X is denoted by F_X(x) or F(x) and is defined as

F(x) = P(X ≤ x)

For a discrete random variable,

F(x) = P(X ≤ x) = ∑_{xi ≤ x} P(xi)

For a continuous random variable,

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt

Properties of Distribution function

1. If F(x) is the distribution function of a random variable X and if a < b, then

𝑃(𝑎 < 𝑋 ≤ 𝑏) = 𝐹(𝑏) – 𝐹(𝑎)

2. 𝑃(𝑎 < 𝑋 < 𝑏) = 𝐹(𝑏) – 𝐹(𝑎) – 𝑃(𝑋 = 𝑏)
3. 𝑃(𝑎 ≤ 𝑋 ≤ 𝑏) = 𝐹(𝑏) – 𝐹(𝑎) + 𝑃(𝑋 = 𝑎)
4. 𝑃(𝑎 ≤ 𝑋 < 𝑏) = 𝐹(𝑏) – 𝐹(𝑎) – 𝑃(𝑋 = 𝑏) + 𝑃(𝑋 = 𝑎)
5. If F(x) is the distribution function of a random variable X, then 0 ≤ F(x) ≤ 1.
6. If F(x) is the distribution function of a random variable X, then F(x) ≤ F(y) for x < y.
7. If F(x) is the distribution function of a random variable X, then F(−∞) = 0, F(∞) = 1.

Various formulae on discrete random variable

1. 𝑀𝑒𝑎𝑛 = ∑ 𝑥. 𝑃(𝑥)
2. 𝑉𝑎𝑟𝑖𝑎𝑛𝑐𝑒 = ∑ 𝑥 2 . 𝑃(𝑥) − (𝑚𝑒𝑎𝑛)2
3. 𝐹(𝑥) = 𝑃(𝑋 ≤ 𝑥) = ∑𝑥 𝑃(𝑋 = 𝑥)
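For a concrete discrete case, the mean, variance, and distribution function of the points on a fair die can be computed exactly (a small sketch; the dictionary `pmf` is illustrative):

```python
from fractions import Fraction

# X = points on a fair die; the p.m.f. is uniform on {1, ..., 6}
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())                 # sum of x * P(x)
variance = sum(x**2 * p for x, p in pmf.items()) - mean**2

def F(x):
    """Distribution function F(x) = P(X <= x) = sum of P(xi) for xi <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

print(mean)      # 7/2
print(variance)  # 35/12
print(F(3))      # 1/2
print(F(0), F(6))  # 0 and 1, matching F(-inf) = 0, F(inf) = 1
```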
Various formulae on continuous random variable

1. Mean = ∫ 𝑥. 𝑓(𝑥) 𝑑𝑥
2. Variance = ∫ 𝑥 2 . 𝑓(𝑥) 𝑑𝑥 − (𝑚𝑒𝑎𝑛)2
3. Harmonic mean H is given by
1 1
= ∫ . 𝑓(𝑥) 𝑑𝑥
𝐻 𝑥
4. Geometric mean G is given by
log 𝐺 = ∫ log 𝑥 . 𝑓(𝑥) 𝑑𝑥

5. Median M is given by (in the range a, b)

∫_a^M f(x) dx = ∫_M^b f(x) dx, i.e., ∫_a^M f(x) dx = 1/2 (or) ∫_M^b f(x) dx = 1/2

6. Mean deviation about the mean

𝑀.𝐷. = ∫ |𝑥 − 𝑚𝑒𝑎𝑛| 𝑓(𝑥) 𝑑𝑥

7. rth moment about origin


𝜇𝑟 ′ = ∫ 𝑥 𝑟 𝑓(𝑥) 𝑑𝑥

8. rth moment about mean


𝜇𝑟 = ∫(𝑥 − 𝑚𝑒𝑎𝑛)𝑟 𝑓(𝑥) 𝑑𝑥
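Several of these continuous formulae can be checked numerically for an assumed density. The sketch below takes f(x) = 2x on [0, 1] (an illustrative choice, for which mean = 2/3, variance = 1/18, and the median solves M² = 1/2) and evaluates the integrals with a midpoint rule:

```python
import math

def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 2 * x  # assumed density on [0, 1], for illustration only

mean = integrate(lambda x: x * f(x), 0, 1)                 # ∫ x f(x) dx
variance = integrate(lambda x: x**2 * f(x), 0, 1) - mean**2
median = 1 / math.sqrt(2)  # solves ∫_0^M 2x dx = M^2 = 1/2

print(round(mean, 4))      # ~0.6667
print(round(variance, 4))  # ~0.0556
print(round(median, 4))    # ~0.7071
assert abs(integrate(f, 0, median) - 0.5) < 1e-6  # median property holds
```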
