Probability and Statistics I


MATH 311
Lecture 1
1. Probability
1.1. Sample Spaces and Events
Definition 1.1. An experiment is any action or process whose outcome is subject to uncertainty.
Definition 1.2. The sample space of an experiment, denoted by S, is the set of all possible
outcomes of that experiment.
Definition 1.3. An event is any collection (subset) of outcomes contained in the sample space S.
An event is simple if it consists of exactly one outcome and compound if it consists of more than
one outcome.

SOME RELATIONS FROM SET THEORY


Definition 1.4. The complement of an event A, denoted by $A'$, is the set of all outcomes in S that are not contained in A.

Definition 1.5. The union of two events A and B, denoted by $A \cup B$ and read "A or B", is the event consisting of all outcomes that are either in A or in B or in both events, that is, all outcomes in at least one of the events.

Definition 1.6. The intersection of two events A and B, denoted by $A \cap B$ and read "A and B", is the event consisting of all outcomes that are in both A and B.

Definition 1.7. Let $\emptyset$ denote the null event (the event consisting of no outcomes whatsoever). When $A \cap B = \emptyset$, A and B are said to be mutually exclusive or disjoint events.

Definition 1.8. The number of elements in a finite set A is called its cardinality and is denoted by $|A|$.

Here are some of the set theory identities:

I. Distributive Laws:

$A \cap (B \cup C) = (A \cap B) \cup (A \cap C)$
$A \cup (B \cap C) = (A \cup B) \cap (A \cup C)$

II. De Morgan's Laws:

$(A \cup B)' = A' \cap B'$
$(A \cap B)' = A' \cup B'$
Example 1.1. Prove that for any sets A and B the following identity holds:

$B = (A \cap B) \cup (A' \cap B),$

where the sets in parentheses are disjoint.

Solution. According to the First Distributive Law:

$(A \cap B) \cup (A' \cap B) = B \cap (A \cup A') = B \cap S = B.$
Example 1.2. Prove that for any sets A and B the following identity holds:

$A \cup B = A \cup (B \cap A'),$

where the sets A and $B \cap A'$ are disjoint.

Solution. We can write

$A \cup (B \cap A') = (A \cup B) \cap (A \cup A') = (A \cup B) \cap S = A \cup B.$
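
The identities above can be checked mechanically on any concrete finite sets. The short Python sketch below is not part of the lecture; the sample space S and the sets A, B, C are arbitrary illustrative choices. It verifies the two Distributive Laws, the De Morgan Laws, and the identities of Examples 1.1 and 1.2 using Python's built-in set operations.

    # Illustrative check of the set identities; S, A, B, C are assumed examples.
    S = set(range(1, 11))                      # sample space {1, ..., 10}
    A, B, C = {1, 2, 3, 4}, {3, 4, 5, 6}, {4, 6, 8}
    comp = lambda X: S - X                     # complement relative to S

    assert A & (B | C) == (A & B) | (A & C)    # First Distributive Law
    assert A | (B & C) == (A | B) & (A | C)    # Second Distributive Law
    assert comp(A | B) == comp(A) & comp(B)    # De Morgan
    assert comp(A & B) == comp(A) | comp(B)    # De Morgan
    assert B == (A & B) | (comp(A) & B)        # Example 1.1
    assert A | B == A | (B & comp(A))          # Example 1.2
    print("all identities hold for this choice of S, A, B, C")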

1.2. Axioms, Interpretations, and Properties of Probability

Axiom 1: For any event A, $P(A) \ge 0$.

Axiom 2: $P(S) = 1$.

Axiom 3: If $A_1, A_2, A_3, \ldots$ is an infinite collection of disjoint events, then

$P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i).$

Proposition 1.1. $P(\emptyset) = 0$. This in turn implies that the property contained in Axiom 3 is valid for a finite collection of events.

Proof. First consider the infinite collection $A_i = \emptyset$ for all natural numbers i. Since $\emptyset \cap \emptyset = \emptyset$, the events in this collection are disjoint and $\bigcup_{i=1}^{\infty} A_i = \emptyset$. The third axiom then gives

$P(\emptyset) = \sum_{i=1}^{\infty} P(\emptyset),$

which can happen only if $P(\emptyset) = 0$.

Now suppose that $A_1, A_2, \ldots, A_k$ are disjoint events, and append to these the infinite collection $A_{k+1} = \emptyset,\ A_{k+2} = \emptyset,\ A_{k+3} = \emptyset, \ldots$ Again invoking the third axiom,

$P\left(\bigcup_{i=1}^{k} A_i\right) = P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i) = \sum_{i=1}^{k} P(A_i),$

as desired.
Proposition 1.2. For any event A, $P(A) + P(A') = 1$, from which $P(A') = 1 - P(A)$.

Proof. In Axiom 3 (applied to a finite collection), let $k = 2$, $A_1 = A$ and $A_2 = A'$. Since, by the definition of $A'$, $A \cup A' = S$ while A and $A'$ are disjoint,

$1 = P(S) = P(A \cup A') = P(A) + P(A').$
Proposition 1.3. For any event A, $P(A) \le 1$.
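
Propositions 1.2 and 1.3 are most often used through the complement: when P(A) is awkward to compute directly, compute P(A') instead. The following minimal Python sketch uses an assumed example that is not taken from the text (A = "at least one six in four rolls of a fair die"), treating all $6^4$ outcomes as equally likely (equally likely outcomes are discussed below).

    from itertools import product

    # Assumed example: A = "at least one six in four rolls of a fair die".
    # All 6**4 outcomes are treated as equally likely.
    outcomes = list(product(range(1, 7), repeat=4))
    p_A = sum(1 for o in outcomes if 6 in o) / len(outcomes)
    p_A_comp = sum(1 for o in outcomes if 6 not in o) / len(outcomes)
    print(round(p_A, 4), round(1 - p_A_comp, 4))   # both about 0.5177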

Proposition 1.4. For any two events A and B,


$P(A \cup B) = P(A) + P(B) - P(A \cap B).$

Proof. First, applying Example 1.2, we can write

$P(A \cup B) = P(A) + P(B \cap A').$   (1)

Now applying Example 1.1,

$P(B) = P(A \cap B) + P(A' \cap B),$

or

$P(B \cap A') = P(B) - P(A \cap B).$   (2)

Combining (1) and (2) we obtain the claim.
Example 2.14 (page 55).
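
Proposition 1.4 can also be verified by direct counting on a small experiment. In the hedged sketch below, the experiment (two fair dice) and the events A and B are illustrative assumptions, not the textbook's Example 2.14.

    from itertools import product

    S = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
    A = {o for o in S if o[0] + o[1] == 7}     # sum is 7
    B = {o for o in S if o[0] == 3}            # first die shows 3
    p = lambda E: len(E) / len(S)
    print(p(A | B), p(A) + p(B) - p(A & B))    # both 11/36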

EQUALLY LIKELY OUTCOMES


In many experiments consisting of n outcomes, so that $|S| = n$, it is reasonable to assign equal probabilities to all n simple events. With $p = P(E_i)$ for every i,

$1 = \sum_{i=1}^{n} P(E_i) = \sum_{i=1}^{n} p = pn, \qquad \text{so} \qquad p = \frac{1}{n}.$

Now consider an event A, with $|A|$ denoting the number of outcomes contained in A. Then

$P(A) = \sum_{E_i \in A} P(E_i) = \sum_{E_i \in A} \frac{1}{n} = \frac{|A|}{n} = \frac{|A|}{|S|}.$
Example 2.16 (page 57).
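
As a very small hedged illustration of $P(A) = |A| / |S|$ (the experiment, one roll of a fair die, and the event A are assumed for the example, not drawn from the text):

    S = {1, 2, 3, 4, 5, 6}                 # one roll of a fair die
    A = {x for x in S if x % 2 == 0}       # the face is even
    print(len(A) / len(S))                 # 0.5, i.e. |A| / |S| = 3/6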

1.3. Counting Techniques


PRODUCT RULE
Proposition 1.5. If the first element of an ordered pair can be selected in $n_1$ ways, and for each of these $n_1$ ways the second element of the pair can be selected in $n_2$ ways, then the total number of such pairs is $n_1 n_2$.
Definition 1.9. We will call an ordered collection of k objects a k-tuple.
Proposition 1.6. Suppose a set consists of ordered collections of k elements (k-tuples) and there are $n_1$ possible choices for the first element; for each choice of the first element, there are $n_2$ possible choices for the second element; ...; for each possible choice of the first $k - 1$ elements, there are $n_k$ choices of the kth element. Then there are $n_1 n_2 \cdots n_k$ possible k-tuples.
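
The product rule can be checked by enumerating the tuples explicitly. In the sketch below the three choice sets are invented purely for illustration.

    from itertools import product

    first, second, third = ["a", "b"], [1, 2, 3], ["w", "x", "y", "z"]
    tuples = list(product(first, second, third))                # all 3-tuples
    print(len(tuples), len(first) * len(second) * len(third))   # 24 24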

PERMUTATIONS AND COMBINATIONS


Definition 1.10. A permutation of r elements (where $r \ge 1$) from a set of n elements is any arrangement, without repetition, of the r elements. The number of permutations of n objects taken r at a time (with $r \le n$) is written $P_{n,r}$.

Proposition 1.7. If $P_{n,r}$ (where $r \le n$) is the number of permutations of n elements taken r at a time, then

$P_{n,r} = \frac{n!}{(n-r)!}.$

Example 2.21 (page 63).
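
A quick hedged check of $P_{n,r} = n!/(n-r)!$ by brute-force enumeration; the values n = 6 and r = 3 are arbitrary illustrative choices.

    from itertools import permutations
    from math import factorial

    n, r = 6, 3
    count = len(list(permutations(range(n), r)))    # enumerate all r-arrangements
    print(count, factorial(n) // factorial(n - r))  # 120 120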

Definition 1.11. A subset of r objects selected without regard to order from a set containing n objects is called a combination, and is denoted $\binom{n}{r}$. It is read "n choose r".

Proposition 1.8. The number of combinations of n elements taken r at a time, where $r \le n$, is

$\binom{n}{r} = \frac{n!}{r!\,(n-r)!}.$

Example 2.22 (page 64).
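
Similarly, the combination formula can be checked by enumeration (again with arbitrary values n = 6, r = 3); Python's math.comb computes the same binomial coefficient directly.

    from itertools import combinations
    from math import comb

    n, r = 6, 3
    count = len(list(combinations(range(n), r)))   # unordered selections of size r
    print(count, comb(n, r))                       # 20 20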

1.4. Conditional Probability


Assume that E and F are two events for a particular experiment, and that the sample space S for this experiment has n possible equally likely outcomes. Suppose event F has m elements and $E \cap F$ has k elements ($k \le m$). Using the fundamental principle of probability,

$P(F) = \frac{m}{n} \qquad \text{and} \qquad P(E \cap F) = \frac{k}{n}.$

We now want to find $P(E \mid F)$, the probability that E occurs given that F has occurred. Since we assume F has occurred, reduce the sample space to F: look only at the m elements inside F (see Figure 1.1). Of these m elements, there are k elements where E also occurs, because $E \cap F$ has k elements. This makes

$P(E \mid F) = \frac{k}{m}.$

Divide numerator and denominator by n to get

$P(E \mid F) = \frac{k/n}{m/n} = \frac{P(E \cap F)}{P(F)}.$
Figure 1.1. (Venn diagram: the m outcomes of F, of which k lie in $E \cap F$ and $m - k$ lie outside E.)

The last result motivates the definition of conditional probability.

Definition 1.12. The conditional probability of an event A given event B, written $P(A \mid B)$, is

$P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0. \qquad (3)$

Example 2.25 (page 68).
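
Definition 1.12 can be illustrated with the same counting argument that motivated it. The experiment (two fair dice) and the events E and F in the sketch below are assumed for illustration and are not the textbook's Example 2.25.

    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    E = {o for o in S if o[0] + o[1] >= 9}     # sum is at least 9
    F = {o for o in S if o[0] == 4}            # first die shows 4
    p = lambda X: len(X) / len(S)
    via_definition = p(E & F) / p(F)           # equation (3)
    via_reduced_space = len(E & F) / len(F)    # k/m, counting inside F only
    print(via_definition, via_reduced_space)   # both 1/3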


The definition of conditional probability yields the following result, obtained by multiplying
both sides of equation (3) by P(B).
The Multiplication Rule

$P(A \cap B) = P(A \mid B) \cdot P(B).$

Example 2.27 (page 70).
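
A small hedged arithmetic illustration of the Multiplication Rule (the two-card scenario is an assumed example, not the textbook's Example 2.27): draw two cards without replacement, with B = "first card is an ace" and A = "second card is an ace".

    p_B = 4 / 52              # first card an ace
    p_A_given_B = 3 / 51      # second card an ace, given the first was
    print(round(p_A_given_B * p_B, 5))   # P(A and B) is about 0.00452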

Definition 1.13. Events $A_1, A_2, \ldots, A_k$ are mutually exclusive if every pair of these events is mutually exclusive, that is, $A_i \cap A_j = \emptyset$ whenever $i \ne j$.


Definition 1.14. Events $A_1, A_2, \ldots, A_k$ are exhaustive if

$\bigcup_{i=1}^{k} A_i = S.$

Theorem 2.1. (The Law of Total Probability). Let $A_1, A_2, \ldots, A_k$ be mutually exclusive and exhaustive events. Then for any other event B,

$P(B) = \sum_{i=1}^{k} P(B \mid A_i)\, P(A_i).$

Proof. We can write

$P(B) = P(B \cap S) = P\left(B \cap \bigcup_{i=1}^{k} A_i\right) = P\left(\bigcup_{i=1}^{k} (B \cap A_i)\right) = \sum_{i=1}^{k} P(B \cap A_i) = \sum_{i=1}^{k} P(B \mid A_i)\, P(A_i).$
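
A numerical sketch of the Law of Total Probability; the scenario (a part produced on one of three machines A1, A2, A3, with B = "the part is defective") and all of the numbers are invented for illustration only.

    p_A = [0.5, 0.3, 0.2]                 # P(A1), P(A2), P(A3); they sum to 1
    p_B_given_A = [0.01, 0.02, 0.05]      # P(B | Ai) for each machine
    p_B = sum(pb * pa for pb, pa in zip(p_B_given_A, p_A))
    print(p_B)                            # 0.021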

Theorem 2.2. (Bayes' Theorem). Let $A_1, A_2, \ldots, A_k$ be a collection of k mutually exclusive and exhaustive events with prior probabilities $P(A_i)$ $(i = 1, \ldots, k)$. Then for any other event B for which $P(B) > 0$, the posterior probability of $A_j$ given that B has occurred is

$P(A_j \mid B) = \frac{P(A_j \cap B)}{P(B)} = \frac{P(B \mid A_j)\, P(A_j)}{\sum_{i=1}^{k} P(B \mid A_i)\, P(A_i)}, \qquad j = 1, \ldots, k.$

Example 2.30 (page 73).
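
Continuing the invented machine example from the Law of Total Probability sketch above, Bayes' Theorem gives the posterior probability that a defective part came from machine A3.

    p_A = [0.5, 0.3, 0.2]
    p_B_given_A = [0.01, 0.02, 0.05]
    p_B = sum(pb * pa for pb, pa in zip(p_B_given_A, p_A))   # 0.021, as before
    posterior_A3 = p_B_given_A[2] * p_A[2] / p_B             # numerator over P(B)
    print(round(posterior_A3, 4))                            # about 0.4762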

1.5. Independence
Definition 1.15. Two events A and B are independent if $P(A \mid B) = P(A)$ and are dependent otherwise.

Example 2.31, 2.32 (page 77).

Proposition 1.9. If $P(A \mid B) = P(A)$, then $P(B \mid A) = P(B)$.

Proof.

$P(B \mid A) = \frac{P(A \cap B)}{P(A)} = \frac{P(A \mid B)\, P(B)}{P(A)} = \frac{P(A)\, P(B)}{P(A)} = P(B).$

Proposition 1.10. A and B are independent if and only if

$P(A \cap B) = P(A)\, P(B).$
Proof is left as an exercise.
Example 2.33 (page 78).
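
The product criterion for independence can be checked by counting. In the sketch below the experiment (two fair dice) and the events A and B are illustrative assumptions, not the textbook's Example 2.33.

    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    A = {o for o in S if o[0] % 2 == 0}        # first die is even
    B = {o for o in S if o[0] + o[1] == 7}     # sum is 7
    p = lambda X: len(X) / len(S)
    print(p(A & B), p(A) * p(B))               # both 1/12: A and B are independent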

Definition 1.16. Events $A_1, A_2, \ldots, A_n$ are mutually independent if for every k $(k = 2, 3, \ldots, n)$ and every subset of indices $i_1, i_2, \ldots, i_k \subseteq \{1, \ldots, n\}$,

$P\left(\bigcap_{j=1}^{k} A_{i_j}\right) = \prod_{j=1}^{k} P(A_{i_j}).$

Example 2.35 (page 79).
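
Definition 1.16 requires the product condition for every subset of indices, not just for pairs. The hedged sketch below assumes an experiment of three fair coin flips with $A_i$ = "flip i is heads" and checks the condition for all index subsets of size 2 and 3.

    from itertools import product, combinations
    from math import prod

    S = list(product("HT", repeat=3))                          # three fair coin flips
    events = [{o for o in S if o[i] == "H"} for i in range(3)] # A1, A2, A3
    p = lambda X: len(X) / len(S)

    ok = True
    for k in (2, 3):
        for idx in combinations(range(3), k):
            inter = set(S)
            for i in idx:
                inter &= events[i]                             # intersection of the chosen events
            ok &= abs(p(inter) - prod(p(events[i]) for i in idx)) < 1e-12
    print(ok)   # True: A1, A2, A3 are mutually independent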
