U3 – Means and Expectations
Taufik Abrão
Department of Electrical Engineering
State University of Londrina, PR – Brazil
Contents
Means and Expectations 3
Statistical Expectations . . . . . . . . . . . . . . . . . . . 4
Statistical Moments 31
Definition . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Characteristic Function . . . . . . . . . . . . . . . . . . . . 57
U3 – Exercises 83
U3 – Exercises . . . . . . . . . . . . . . . . . . . . 84
Means and Expectations
Statistical Expectations
…random variables.
• For example, suppose we wish to determine the average height of a population. Assume we can obtain the height data for all N individuals, with centimeter precision.
• We then have a finite number, n, of possible values for the height, xi, with i ranging from 1 to n.
• If the number of people with height xi is Ni ⇒ the average height is:

\bar{x} = \frac{N_1 x_1 + N_2 x_2 + \ldots + N_n x_n}{N} = \frac{N_1}{N} x_1 + \frac{N_2}{N} x_2 + \ldots + \frac{N_n}{N} x_n,

with N = \sum_i N_i
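The weighted-average formula above can be checked with a small numeric sketch; the heights and counts below are made-up illustrative values, not data from the slides:

```python
# Sketch: population mean as a weighted sum of distinct values.
# Heights (cm) and counts N_i are illustrative data.
heights = [160, 170, 180]          # distinct values x_i
counts = [200, 500, 300]           # N_i individuals with each height

N = sum(counts)                    # total population size N = sum_i N_i
# x_bar = (N1*x1 + ... + Nn*xn)/N = sum_i (N_i/N)*x_i
mean_direct = sum(n_i * x for n_i, x in zip(counts, heights)) / N
mean_weighted = sum((n_i / N) * x for n_i, x in zip(counts, heights))

print(mean_direct)                 # both forms ≈ 171.0
```

Both expressions agree, since dividing the weighted sum by N term by term gives the N_i/N weights.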
\bar{X} = E\{X\} = \sum_{i=1}^{n} x_i\, P(X = x_i)

and, for a continuous r.v., E\{X\} = \int_{-\infty}^{\infty} x f_X(x)\,dx, which is equivalent to:

E\{X\} = \int_{0}^{\infty} x f_X(x)\,dx + \int_{-\infty}^{0} x f_X(x)\,dx
= \int_{0}^{\infty} [1 - F_X(x)]\,dx - \int_{-\infty}^{0} F_X(x)\,dx.
Taufik Abrão – Londrina, May 3, 2018
[Figure 4.2: The expectation of the RV X is the difference of the shaded regions.]
Remarks:
1. if fX(x) is an even function, then E{X} = 0;
2. if fX(x) is symmetric about x = a, then E{X} = a;
3. the r.v. X may never actually take the value \bar{X} in any experiment.
E\{X\} = \int_{0}^{\infty} [1 - F_X(t)]\,dt

⇒ X continuous, non-negative; F_X(x): cumulative distribution function of X
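The survival-function form of the mean can be verified numerically; as an illustrative case (not from the slides), take an exponential r.v. with rate a, where F_X(t) = 1 − e^{−at} and E{X} = 1/a:

```python
import math

# Numeric check of E{X} = integral_0^inf [1 - F_X(t)] dt for a
# non-negative r.v., using Exp(a): 1 - F_X(t) = exp(-a t), E{X} = 1/a.
a = 2.0

def survival(t):                 # 1 - F_X(t)
    return math.exp(-a * t)

# simple trapezoidal integration on [0, 20]; the tail beyond is negligible
n, T = 200_000, 20.0
h = T / n
integral = 0.5 * (survival(0.0) + survival(T))
for k in range(1, n):
    integral += survival(k * h)
integral *= h

print(integral)                  # ≈ 0.5 = 1/a
```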
Example – uniform r.v. on [a, b]:

E\{X\} = \bar{X} = \int_{a}^{b} x\, \frac{1}{b-a}\,dx = \frac{b^2 - a^2}{2(b-a)} = \frac{b+a}{2}

⇒ exactly the midpoint of the interval [a, b]
E\{X\} = \bar{X} = \int_{-\infty}^{\infty} x\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-m)^2/2\sigma^2}\,dx = m

Hint: adopt z = \frac{x-m}{\sigma} and use the result in the figure below:

[Figure: plots of e^{-0.5 z^2} and z\,e^{-0.5 z^2} versus z; z \cdot e^{-z^2/2} is an odd function.]
\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i \qquad (3)
Consider the variance of \bar{X}
\mathrm{Var}[\bar{X}] = E\left\{\left(\bar{X} - E[\bar{X}]\right)^2\right\}

where the argument is

\bar{X} - E[\bar{X}] = \bar{X} - \mu_X = \frac{1}{n}\sum_{i=1}^{n}(X_i - \mu_X) = \frac{1}{n}\sum_{i=1}^{n} Y_i
\mathrm{Var}[\bar{X}] = \frac{\sigma_X^2}{n} \qquad (8)
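The variance of the sample mean can be checked by simulation. A minimal sketch, assuming X_i uniform on [0, 1] (so σ² = 1/12) as an illustrative choice:

```python
import random
import statistics

# Simulation sketch: Var of the sample mean of n i.i.d. observations
# is sigma^2 / n.  Here X_i ~ Uniform(0,1), sigma^2 = 1/12.
random.seed(1)
n, trials = 25, 20_000
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(trials)]
var_of_mean = statistics.pvariance(means)

print(var_of_mean, (1 / 12) / n)   # both ≈ 0.00333
```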
v_1 = v(t_1) = \{v(t_1, E_i),\ \text{all } i\}
v_2 = v(t_2) = \{v(t_2, E_i),\ \text{all } i\}

[Figure: sample functions v(t, E_1), v(t, E_2), v(t, E_3), v(t, E_4) of a random process, observed at the instants t_1 and t_2.]
E\{Y\} = \int_{-\infty}^{\infty} y f_Y(y)\,dy

where:

y f_Y(y)\,dy = y\, P\{y \le Y < y + dy\} = y \sum_i P\{x_i \le X < x_i + dx_i\}
= \sum_i g(x_i)\, f_X(x_i)\,dx_i
[Figure: mapping y = g(x); the interval (y, y + \Delta y) corresponds to the intervals (x_1, x_1 + \Delta x_1), (x_2, x_2 + \Delta x_2), (x_3, x_3 + \Delta x_3) on the x axis.]
Consider a sinusoidal generator with output A cos ωt. This output is then sampled at random instants. The sampled output therefore constitutes a random variable.
Let X be a r.v. whose density function is ∼ N(m, σ). Show that E[X] = m.

E[X - m] = \int_{-\infty}^{\infty} (x - m) f_X(x)\,dx = \int_{-\infty}^{\infty} (x - m)\, \frac{e^{-(x-m)^2/2\sigma^2}}{\sigma\sqrt{2\pi}}\,dx

change of variable: y = \frac{x - m}{\sigma} \;\Rightarrow\; \sigma\,dy = dx

\therefore\; E[X - m] = \sigma \times \underbrace{\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} y\, e^{-\frac{y^2}{2}}\,dy}_{\text{first moment of } N[0,1]} = 0
with the same change of variable, we obtain:

E[(X - m)^2] = \sigma^2 \times \underbrace{\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} y^2\, e^{-\frac{y^2}{2}}\,dy}_{\text{second moment of } N[0,1]} = \sigma^2
E\{Y\} = \bar{Y} = \int_{-\infty}^{\infty} x^2\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-0.5(x/\sigma)^2}\,dx

With z = \frac{x}{\sigma} and assuming σ > 0:

E\{Y\} = \sigma^2\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} z^2 e^{-z^2/2}\,dz

The term in parentheses can be reduced to

\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-z^2/2}\,dz = 1

with the aid of integration by parts. Finally:

E\{Y\} = E\{X^2\} = \sigma^2
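The second-moment result E{X²} = σ² for a zero-mean Gaussian can be checked by Monte Carlo; σ = 3 below is an arbitrary illustrative choice:

```python
import random
import statistics

# Monte Carlo sketch: for X ~ N(0, sigma), E{X^2} = sigma^2
# (the second moment equals the variance when the mean is zero).
random.seed(7)
sigma = 3.0
xs = [random.gauss(0.0, sigma) for _ in range(200_000)]
second_moment = statistics.fmean(x * x for x in xs)

print(second_moment)      # close to sigma^2 = 9
```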
For a function g of a random vector of order m:

E\{Y\} = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} g(x_1, \cdots, x_m)\, f_{X}(x_1, \cdots, x_m)\,dx_1 \cdots dx_m. \qquad (12)
Mean of the Sum
then

E\{X + Y\} = E\{X\} + E\{Y\}

Special cases:
Mean of the Product
Special cases:
The joint pdf of X and Y is expressed by:

f_{XY}(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-\frac{(x-a)^2 + (y-b)^2}{2\sigma^2}}

Direct substitution into Eq. (12):

E\{Z\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\, f_{XY}(x, y)\,dx\,dy
= \frac{1}{2\pi\sigma^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x \cdot y \cdot e^{-\frac{(x-a)^2 + (y-b)^2}{2\sigma^2}}\,dx\,dy
= \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} x\, e^{-\frac{(x-a)^2}{2\sigma^2}}\,dx \;\times\; \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} y\, e^{-\frac{(y-b)^2}{2\sigma^2}}\,dy
= a \cdot b \quad \text{(see the result of the previous exercise)}
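The factorization E{XY} = a·b for this separable joint pdf can be checked by simulation; the values of a, b, σ below are illustrative:

```python
import random
import statistics

# Monte Carlo sketch of E{Z} = E{XY} = a*b for independent Gaussians
# X ~ N(a, sigma) and Y ~ N(b, sigma).
random.seed(42)
a, b, sigma = 2.0, -1.5, 1.0
samples = [random.gauss(a, sigma) * random.gauss(b, sigma)
           for _ in range(200_000)]
mean_xy = statistics.fmean(samples)

print(mean_xy)        # close to a*b = -3.0
```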
…random vector Y of order n:

E\{X|Y\} = \int_{-\infty}^{\infty} x\, f_X(x|Y)\,dx = \frac{\int_{-\infty}^{\infty} x\, f_{XY}(x, Y)\,dx}{f_Y(Y)}. \qquad (20)

which is a r.v., a function of the random vector Y, whose mean can then be computed as:

E\{E\{X|Y\}\} = \int_{-\infty}^{\infty} E\{X|y\}\, f_Y(y)\,dy = \int_{-\infty}^{\infty} \frac{\int_{-\infty}^{\infty} x\, f_{XY}(x, y)\,dx}{f_Y(y)}\, f_Y(y)\,dy
= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\, f_{XY}(x, y)\,dx\,dy = E\{X\} \qquad (21)
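The law of total expectation, Eq. (21), can be illustrated by simulation under an assumed toy model (not from the slides): Y ~ Uniform(0, 1) and, given Y = y, X ~ N(y, 1), so E{X|Y} = Y and both sides equal E{Y} = 0.5:

```python
import random

# Simulation sketch of E{E{X|Y}} = E{X} for the assumed model above.
random.seed(3)
n = 200_000
sum_x = sum_y = 0.0
for _ in range(n):
    y = random.random()            # Y ~ Uniform(0,1)
    x = random.gauss(y, 1.0)       # X | Y=y ~ N(y, 1)
    sum_x += x                     # accumulates E{X}
    sum_y += y                     # accumulates E{E{X|Y}} = E{Y}
e_x, e_cond = sum_x / n, sum_y / n

print(e_x, e_cond)                 # both ≈ 0.5
```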
Statistical Moments

Definition 0.1 Moment of order k of the r.v. X:

m_k = E\{X^k\} = \overline{X^k} = \int_{-\infty}^{\infty} x^k f_X(x)\,dx. \qquad (22)

and the central moments of order k by

\mu_k = E\{(X - \bar{X})^k\} = \overline{(X - \bar{X})^k} = \int_{-\infty}^{\infty} (x - \bar{X})^k f_X(x)\,dx. \qquad (23)
⇒ m_0 = 1; \; m_1 = E\{X\}; \qquad \mu_0 = 1; \; \mu_1 = 0
standard deviation: \sigma_X

Relation between moments and central moments:

\mu_k = \sum_{r=0}^{k} \binom{k}{r} (-1)^r\, m_1^r\, m_{k-r} \qquad (24)
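Relation (24) between raw and central moments can be checked numerically on a sample, since on any fixed sample it holds as an algebraic identity; the exponential sample below is illustrative:

```python
import math
import random
import statistics

# Numeric sketch of Eq. (24): central moments from raw moments,
# mu_k = sum_{r=0}^{k} C(k,r) (-1)^r m1^r m_{k-r}, checked on a sample.
random.seed(5)
xs = [random.expovariate(1.0) for _ in range(10_000)]

def raw(k):       # m_k: sample mean of x^k
    return statistics.fmean(x**k for x in xs)

def central(k):   # mu_k: sample mean of (x - m1)^k
    m1 = raw(1)
    return statistics.fmean((x - m1)**k for x in xs)

for k in range(0, 4):
    via_raw = sum(math.comb(k, r) * (-1)**r * raw(1)**r * raw(k - r)
                  for r in range(k + 1))
    assert abs(via_raw - central(k)) < 1e-8
print("Eq. (24) holds for k = 0..3 on the sample")
```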
Covariance of X and Y
\mu_{11} = E\left\{(X - \bar{X}) \cdot (Y - \bar{Y})\right\} = \sigma_{XY} \qquad (27)
Properties
I. |ρ| ≤ 1. Proof:
E\left\{\left[\lambda(X - \bar{X}) - (Y - \bar{Y})\right]^2\right\} \ge 0, \qquad \lambda \in \mathbb{R}
Expanding gives a quadratic in λ, \lambda^2\sigma_x^2 - 2\lambda\sigma_{xy} + \sigma_y^2 \ge 0, whose discriminant must therefore satisfy Δ ≤ 0:
\Delta = 4\sigma_{xy}^2 - 4\sigma_x^2\sigma_y^2 \le 0 \;\Rightarrow\; \sigma_{xy}^2 \le \sigma_x^2\,\sigma_y^2 \;\therefore\; |\rho| \le 1
II. Case |ρ| = 1. Consider again the non-negative expression above,

E\left\{\left[\lambda(X - \bar{X}) - (Y - \bar{Y})\right]^2\right\} \ge 0, \quad \text{with } \lambda = \frac{\mu_{11}}{m_{20}} = \frac{\sigma_{XY}}{\sigma_X^2},

then:

E\left\{\left[\lambda(X - \bar{X}) - (Y - \bar{Y})\right]^2\right\} = 0 \qquad (30)

or

E\left\{\left[\frac{\sigma_{XY}}{\sigma_X^2} X + \bar{Y} - \frac{\sigma_{XY}}{\sigma_X^2} \bar{X} - Y\right]^2\right\} = 0 \qquad (31)

or, equivalently,

C(\lambda) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \left[\frac{\mu_{11}}{m_{20}}(X - \bar{X}) - (Y - \bar{Y})\right]^2 f_{XY}(x, y)\,dx\,dy = 0
\frac{\mu_{11}}{m_{20}}(X - \bar{X}) - (Y - \bar{Y}) = 0

Conclusion: from Eq. (30), when |ρ| = 1, Y is a linear combination of X, i.e., Y = aX + b. Applying this to (31) finally yields:

Y = \frac{\sigma_{XY}}{\sigma_X^2} X + \bar{Y} - \frac{\sigma_{XY}}{\sigma_X^2} \bar{X} = \frac{\mu_{11}}{m_{20}} X + m_y - \frac{\mu_{11}}{m_{20}} m_x
Exercise – …X and Y is given by … = σ². Assuming the transformation:

X = U + W_1
Y = W_2 - U, \quad \text{where } U \text{ is a signal,}

1. find the correlation coefficient between X and Y. Express it in terms of the signal-to-noise ratio of U, i.e., \mathrm{SNR} \triangleq \frac{E[U^2]}{\sigma^2}
Y = cos X
Covariance matrix of a real random vector X:

C_X = E\left\{(X - \bar{X})(X - \bar{X})^T\right\} \qquad (32)
= \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} (x - \bar{X})(x - \bar{X})^T f_X(x)\,dx
= \begin{pmatrix} \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1n} \\ \sigma_{21} & \sigma_2^2 & \cdots & \sigma_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{n1} & \sigma_{n2} & \cdots & \sigma_n^2 \end{pmatrix}
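A sample estimate of C_X can be sketched for a toy two-dimensional vector (an assumed model, not from the slides): X_2 = X_1 plus independent unit-variance noise, so the true matrix is [[1, 1], [1, 2]]:

```python
import random
import statistics

# Sketch: sample covariance matrix (Eq. (32)) for X = (X1, X2),
# with X1 ~ N(0,1) and X2 = X1 + independent N(0,1) noise.
random.seed(2)
n = 100_000
x1 = [random.gauss(0.0, 1.0) for _ in range(n)]
x2 = [a + random.gauss(0.0, 1.0) for a in x1]

def cov(u, v):
    mu, mv = statistics.fmean(u), statistics.fmean(v)
    return statistics.fmean((ui - mu) * (vi - mv) for ui, vi in zip(u, v))

C = [[cov(x1, x1), cov(x1, x2)],
     [cov(x2, x1), cov(x2, x2)]]
print(C)      # close to [[1, 1], [1, 2]]
```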
Independence, Correlation, and Orthogonality
Consequences:
A. Chebyshev's Inequality
1 - P(|X - \mu| \le \epsilon) \le \frac{\sigma^2}{\epsilon^2} \;\Rightarrow\; P(|X - \mu| \le \epsilon) \ge 1 - \frac{\sigma^2}{\epsilon^2},

\therefore\; P(\mu - \epsilon \le X \le \mu + \epsilon) \ge 1 - \frac{\sigma^2}{\epsilon^2} \qquad (37)

which is close to 1 for σ ≪ ε.

Proof:

\sigma^2 \triangleq \int_{-\infty}^{\infty} (x - \bar{X})^2 f_X(x)\,dx \ge \int_{|x - \bar{X}| \ge k\sigma} (x - \bar{X})^2 f_X(x)\,dx
\ge (k\sigma)^2 \int_{|x - \bar{X}| \ge k\sigma} f_X(x)\,dx = k^2\sigma^2\, P(|X - \mu| \ge k\sigma)
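Inequality (37) can be illustrated by simulation; an exponential r.v. with rate 1 (µ = 1, σ² = 1) and ε = 2 is an illustrative choice:

```python
import random

# Simulation sketch of Chebyshev's inequality (Eq. (37)):
# P(|X - mu| <= eps) >= 1 - sigma^2/eps^2, here for Exp(1).
random.seed(11)
mu, var, eps = 1.0, 1.0, 2.0
n = 200_000
inside = sum(1 for _ in range(n)
             if abs(random.expovariate(1.0) - mu) <= eps)
p_inside = inside / n
bound = 1 - var / eps**2          # Chebyshev lower bound: 0.75

print(p_inside, ">=", bound)      # empirical ≈ 0.95, bound 0.75
```

As expected, the true probability comfortably exceeds the (loose) Chebyshev bound.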
B. Markov's Inequality

P(Y \ge \alpha) \le \frac{E\{Y\}}{\alpha} \qquad (38)

Proof:

E\{Y\} = \int_{0}^{\infty} y\, f_Y(y)\,dy \ge \int_{\alpha}^{\infty} y\, f_Y(y)\,dy \ge \alpha \int_{\alpha}^{\infty} f_Y(y)\,dy = \alpha\, P(Y \ge \alpha)
C. Chernoff Bound
…available ⇒ it is possible to obtain tighter bounds than the Chebyshev and Markov inequalities.

Consider the Markov inequality again:
◦ the region of interest is the set A = \{Y \ge \alpha\}
◦ indicator function

I_a(y) = \begin{cases} 1, & \text{if } y \in A \\ 0, & \text{otherwise} \end{cases}
[Figure 4: Bounds on the indicator function I_a(y) for A = \{Y \ge \alpha\}; the exponential e^{s(y - a)} upper-bounds I_a(y).]
Thus, we have:

\Pr(Y \ge \alpha) = \int_{0}^{\infty} I_a(y)\, f_Y(y)\,dy \le \int_{0}^{\infty} \frac{y}{\alpha}\, f_Y(y)\,dy = \frac{E\{Y\}}{\alpha}

By changing the upper bound on the indicator function I_a(y) ⇒ different bounds on \Pr(Y \ge \alpha) are obtained:
I_a(y) \le e^{s(y - \alpha)},
Chernoff Bound:

\Pr(Y \ge \alpha) = \int_{0}^{\infty} I_a(y)\, f_Y(y)\,dy \le \int_{0}^{\infty} e^{s(y - \alpha)}\, f_Y(y)\,dy
= e^{-s\alpha} \int_{0}^{\infty} e^{sy}\, f_Y(y)\,dy = e^{-s\alpha} \cdot E\{e^{sY}\} \qquad (39)
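The tightening promised by (39) can be seen numerically. As an illustrative case (not from the slides), take Y ~ Exponential(1), for which E{e^{sY}} = 1/(1 − s) for s < 1, and compare the bounds with the exact tail e^{−α}:

```python
import math

# Sketch: Markov vs. Chernoff bound (Eq. (39)) on Pr(Y >= alpha)
# for Y ~ Exp(1), where E{exp(sY)} = 1/(1 - s), 0 < s < 1.
alpha = 5.0
true_tail = math.exp(-alpha)            # exact Pr(Y >= alpha)
markov = 1.0 / alpha                    # E{Y}/alpha, with E{Y} = 1

# Chernoff: minimize exp(-s*alpha)/(1 - s) over s via a coarse grid
chernoff = min(math.exp(-s * alpha) / (1.0 - s)
               for s in (k / 1000 for k in range(1, 1000)))

print(true_tail, chernoff, markov)      # ≈ 0.0067 <= ≈ 0.092 <= 0.2
```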
Moment Generating Function
Definition
\Phi_x(s) \triangleq E\{e^{sX}\} = \int_{-\infty}^{\infty} e^{sx} f_X(x)\,dx
E\{e^{sX}\} = E\left\{1 + sX + \frac{(sX)^2}{2!} + \ldots + \frac{(sX)^n}{n!} + \ldots\right\}
= 1 + s\, m_1 + \frac{s^2}{2!}\, m_2 + \ldots + \frac{s^n}{n!}\, m_n + \ldots

where m_n is the nth statistical moment of the r.v. X, Eq. (22).¹

\Phi_x^{(k)}(0) \triangleq \left.\frac{d^k}{ds^k}\, \Phi_x(s)\right|_{s=0} \;\Rightarrow\; m_k = \Phi_x^{(k)}(0), \quad k = 0, 1, \ldots \qquad (43)
¹For example, no moments exist for the Cauchy distribution.
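Property (43) can be sketched by differentiating an MGF numerically at s = 0. The exponential r.v. with rate 1 (Φ_x(s) = 1/(1 − s), moments m_k = k!) is an illustrative choice:

```python
# Sketch of Eq. (43): moments as derivatives of the MGF at s = 0,
# using Phi_x(s) = 1/(1 - s) for Exp(1), whose moments are m_k = k!.
def mgf(s):
    return 1.0 / (1.0 - s)

h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)                 # Phi'(0)  -> m1 = 1
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2     # Phi''(0) -> m2 = 2

print(m1, m2)        # ≈ 1 and ≈ 2
```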
Characteristic Function
…for a discrete r.v. it is obtained as:

M_x(\omega) = \sum_i e^{j\omega x_i} \cdot P_X(x_i) \qquad (45)
…(47), i.e., the convolution operator ∗ is applied recursively.

• Since computing (47) can be tedious, one may alternatively employ the Fourier convolution theorem: the FT of two or more convolved signals is equivalent to the product of the signals in the transform domain:

Time-domain convolution: \; v_1(t) * v_2(t) = \int_{-\infty}^{\infty} v_1(\tau)\, v_2(t - \tau)\,d\tau \;\leftrightarrow\; V_1(f)\, V_2(f)
Frequency-domain convolution: \; v_1(t)\, v_2(t) \;\leftrightarrow\; V_1 * V_2 = \int_{-\infty}^{\infty} V_1(\lambda)\, V_2(f - \lambda)\,d\lambda
Example – Characteristic Function of a Gaussian r.v. Obtain:

M_x(\omega) = \exp\left( j\omega\mu - \frac{\sigma^2\,\omega^2}{2} \right)
Example – Characteristic Function of a Sum of r.v.s. Given a r.v. …
… = M_y(\omega) \cdot M_x(\omega)
pdf of

Z = \sum_{i=1}^{n} X_i

The pdf of Z is given by (47) and is related to the characteristic function through the Fourier convolution theorem:

f_Z(z) = f_{X_1}(z) * f_{X_2}(z) * \ldots * f_{X_n}(z)
\Updownarrow
M_z(\omega) = M_{x_1}(\omega) \times M_{x_2}(\omega) \times \ldots \times M_{x_n}(\omega)

Additionally, since the n r.v.s X_i are i.i.d. N(0, 1), the characteristic functions of all the r.v.s are identical:

M_x(\omega) \triangleq M_{x_1}(\omega) = M_{x_2}(\omega) = \ldots = M_{x_n}(\omega)

where:

M_x(\omega) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{x^2}{2}}\, e^{j\omega x}\,dx
Completing the square in the exponent above:

M_x(\omega) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}\left(x^2 - 2j\omega x\right)}\,dx = e^{-\frac{\omega^2}{2}} \times \underbrace{\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}(x - j\omega)^2}\,dx}_{\text{Gaussian pdf}}

The integral of the Gaussian pdf, with mean jω, over (−∞, ∞) equals 1. Therefore,

M_x(\omega) = e^{-\frac{\omega^2}{2}}

and the corresponding characteristic function of Z is:

M_z(\omega) = [M_x(\omega)]^n = e^{-\frac{n\omega^2}{2}}

Inspecting M_z(ω), one deduces that f_Z(z) must be Gaussian. Using the inversion formula (46), one obtains:

f_Z(z) = \frac{1}{2\pi} \int_{-\infty}^{\infty} M_Z(\omega)\, e^{-j\omega z}\,d\omega = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-\frac{n\omega^2}{2}}\, e^{-j\omega z}\,d\omega
f_Z(z) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-\left(\frac{n}{2}\omega^2 + jz\omega\right)}\,d\omega

f_Z(z) = \frac{1}{\sqrt{2\pi n}}\, e^{-\frac{z^2}{2n}}

In fact, the pdf of Z is Gaussian with zero mean and variance n.
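The inversion result (Z Gaussian with zero mean and variance n) can be checked by direct simulation; n = 10 below is an arbitrary illustrative choice:

```python
import random
import statistics

# Monte Carlo sketch: Z = sum of n i.i.d. N(0,1) r.v.s is Gaussian
# with zero mean and variance n.
random.seed(9)
n, trials = 10, 50_000
zs = [sum(random.gauss(0.0, 1.0) for _ in range(n))
      for _ in range(trials)]
mean_z = statistics.fmean(zs)
var_z = statistics.pvariance(zs)

print(mean_z, var_z)     # ≈ 0 and ≈ 10
```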
with E\{X_i\} = m_i and \sigma^2_{X_i} = \sigma^2_i, form the normalized sum r.v.:

X \triangleq \frac{X_1 + X_2 + \ldots + X_n}{\sqrt{n}} = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} X_i.
The central limit theorem states that, under certain general conditions, the distribution of X tends to a Gaussian,

f_X(x) \to \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x - m)^2}{2\sigma^2}},

provided the conditions are satisfied:
[Figure: histograms illustrating the CLT for samples from Weibull (A = B = 1.3), Lognormal (µ = −2, σ = 1), and Poisson (λ = 2) distributions, together with a normal probability plot.]
Example – Suppose that orders in a restaurant are modeled by i.i.d. random variables (with mean $8 and standard deviation $2 per customer, consistent with the solutions below).
a) Estimate the probability that the first 100 customers spend a total of more than $840.
Solution a) probability that the first 100 customers spend a total of more than 840 dollars:

\Pr[S_\Sigma > 840] = \Pr\left[ Z_{100} > \frac{840 - 800}{20} \right] = Q(2) \approx 0.023
Solution b) probability that the first 100 customers spend a total between 780 and 820 dollars:

\Pr[780 \le S_\Sigma \le 820] = \Pr[-1 \le Z_{100} \le 1] = 1 - 2Q(1) \approx 0.682

Solution c) The problem here is to find the value of n for which:

\Pr[S_n > 1000] = 0.9

\Pr[S_n > 1000] = \Pr\left[ Z_n > \frac{1000 - 8n}{2\sqrt{n}} \right] = 0.9

Recalling that Q(−x) = 1 − Q(x) and that Q^{-1}(0.9) = -1.2816 (from tables), we obtain:

\frac{1000 - 8n}{2\sqrt{n}} = -1.2816

which yields the quadratic equation in \sqrt{n}: \; 8n - 2.563\sqrt{n} - 1000 = 0

The positive root gives \sqrt{n} = 11.34, i.e., n = 128.6 \;\Rightarrow\; n = 129 customers
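The root-finding step of solution (c) can be reproduced directly with the quadratic formula in u = √n, using the same mean $8, std $2, and Q⁻¹(0.9) magnitude from the slides:

```python
import math

# Sketch of solution (c): solve 8u^2 - c*u - 1000 = 0 with u = sqrt(n),
# where c = 2*|Q^{-1}(0.9)| = 2*1.2816 (mean $8, std $2 per customer).
c = 2 * 1.2816
u = (c + math.sqrt(c * c + 4 * 8 * 1000)) / (2 * 8)   # positive root
n = u * u

print(u, n, math.ceil(n))    # ≈ 11.34, ≈ 128.6 -> 129 customers
```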
Let X and Y be independent, exponentially distributed r.v.s:

f_X(x) = a \cdot e^{-ax} \qquad f_Y(y) = b \cdot e^{-by}

Determine:
1. the PDF of Z = X + Y when a = b;
2. the PDF of Z = X + Y when a ≠ b.
Solution: Since the number of r.v.s being summed is not large enough (the pdf of Z will NOT be Gaussian, cf. (48)), the PDF of Z = X + Y is obtained from the convolution:

f_Z(z) = f_X * f_Y = \int_{-\infty}^{\infty} f_X(z - y) \cdot f_Y(y)\,dy = ab \int_{-\infty}^{\infty} e^{-a(z-y)} \cdot e^{-by}\, u(z - y)\, u(y)\,dy

= ab\, e^{-az} \int_{0}^{z} e^{(a-b)y}\,dy \cdot u(z) = \frac{ab}{a - b}\, e^{-az} \left[ e^{(a-b)y} \right]_{y=0}^{y=z} u(z)

= \frac{ab}{a - b} \left( e^{-bz} - e^{-az} \right) u(z) \qquad \text{valid for } a \ne b.

In the case a = b this reduces to:

f_Z(z) = a^2 z \cdot e^{-az} \cdot u(z)
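The convolution result for a ≠ b can be cross-checked by Monte Carlo through its CDF, obtained by integrating f_Z from 0 to z₀; the values a = 1, b = 2, z₀ = 1 are illustrative:

```python
import math
import random

# Monte Carlo check of Z = X + Y, X ~ Exp(a), Y ~ Exp(b), a != b:
# f_Z(z) = ab/(a-b) (exp(-b z) - exp(-a z)) u(z).
random.seed(13)
a, b, z0 = 1.0, 2.0, 1.0
n = 200_000
hits = sum(1 for _ in range(n)
           if random.expovariate(a) + random.expovariate(b) <= z0)

# CDF from integrating f_Z on [0, z0]
cdf = 1 - (a * math.exp(-b * z0) - b * math.exp(-a * z0)) / (a - b)

print(hits / n, cdf)     # both ≈ 0.40
```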
Z = √(12/n) · Σ_{i=1}^{n} Xi
Note that the sum above has been normalized so that Z has zero mean and unit variance (the variance of each Xi is σ²_{Xi} = 1/12). Determine:
1. the type of distribution Z assumes if n is large enough;
2. the PDF of Z.
Hint: use the following facts: Y = X1 + X2 + . . . + Xn and therefore fY (z) = fX1 (z) ∗ fX2 (z) ∗ · · · ∗ fXn (z).
Solution. If Y = X1 + X2 + . . . + Xn, then fY (z) = fX1 (z) ∗ fX2 (z) ∗ · · · ∗ fXn (z), and therefore
fZ (z) = √(n/12) · fY (√(n/12) · z)
As n grows, the pdf fZ (z) ⇒ Gaussian.
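The convergence of Z to a Gaussian can be illustrated by simulation. The Python sketch below assumes Xi ~ Uniform(−1/2, 1/2), which matches the stated variance 1/12 (the statement of the exercise is partly lost, so this distribution is an assumption), and checks that Z has roughly zero mean and unit variance:

```python
import random
from math import sqrt

random.seed(1)
n, trials = 48, 20000   # n uniforms per sum, number of Z samples

def draw_Z():
    # Assumed: Xi ~ Uniform(-1/2, 1/2), i.e. zero mean and variance 1/12
    s = sum(random.random() - 0.5 for _ in range(n))
    return sqrt(12.0 / n) * s

zs = [draw_Z() for _ in range(trials)]
mean = sum(zs) / trials
var = sum((z - mean) ** 2 for z in zs) / trials
print(mean, var)   # close to 0 and 1; a histogram of zs is bell-shaped
```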
This can be computed from the relation between the probability density function and the characteristic function:
MY (ω) = MX1 (ω/√n) × MX2 (ω/√n) × . . . × MXn (ω/√n)
Example – A constant but unknown voltage value is measured. Each measurement Xi can be modeled as the sum of the desired voltage value v and a noise voltage Ni:
Xi = v + Ni
Assume the noise voltages are independent r.v.'s. How many measurements are needed so that, with probability greater than or equal to 0.99, the sample mean, (1/n) Σ_{j=1}^{n} Xj, lies within ±1 µV of the true mean?
Solution. Assuming the number of measurements n is large enough for the central limit theorem to apply, each measurement has the distribution
X ∼ N(µ = v; σ² = (1 µV)²)
Normalizing the sample mean X̃:
(X̃ − µ)/(σ/√n) ∼ N[0; 1]
Event: probability greater than or equal to 0.99 that the sample mean lies within ±1 µV of the true mean:
Pr[−b0.99 ≤ (X̃ − µ)/(σ/√n) ≤ b0.99] = 0.99
−b0.99 · σ/√n ≤ X̃ − µ ≤ b0.99 · σ/√n
or, equivalently,
µ − (σ/√n) · b0.99 ≤ X̃ ≤ µ + (σ/√n) · b0.99
Finally:
(σ/√n) · b0.99 ≤ 1 µV, i.e. (1 µV/√n) · 2.5758 ≤ 1 µV, which gives n ≥ 7 samples.
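The final sample-size step can be sketched in a few lines of Python (the noise standard deviation of 1 µV is taken from the example; b0.99 = 2.5758 is the tabulated two-sided 99% point):

```python
from math import erfc, sqrt, ceil

def Q(x):
    """Gaussian tail Q(x) = P(Z > x), Z ~ N(0, 1)."""
    return 0.5 * erfc(x / sqrt(2.0))

b99 = 2.5758      # two-sided 99% point: Q(b99) ≈ 0.005
sigma = 1.0       # noise standard deviation, in uV
tol = 1.0         # accuracy requirement, in uV

# (sigma/sqrt(n)) * b99 <= tol  ->  n >= (b99 * sigma / tol)^2
n = ceil((b99 * sigma / tol) ** 2)
print(n)   # 7
```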
For δ > 0, let g(X) be the unit-step indicator of the event {X ≥ δ}, so that P(X ≥ δ) = E{g(X)}, and bound it by the exponential:
g(X) ≤ e^(α(X−δ)), with α ≥ 0

Figure 6: Function approximation for the Chernoff bound (the exponential e^(α(X−δ)) dominates the step function g(X), which rises from 0 to 1 at X = δ).

Therefore, we get the bound:
P(X ≥ δ) = E[g(X)] ≤ E[e^(α(X−δ))] = e^(−αδ) · E[e^(αX)], α ≥ 0   (50)
To obtain the tightest bound, we optimize α:
d/dα [ e^(−αδ) · E[e^(αX)] ] = 0
⇒ The α_opt can be obtained from
E[X e^(α_opt X)] − δ E[e^(α_opt X)] = 0
The solution of this equation gives the Chernoff bound:
P(X ≥ δ) ≤ e^(−α_opt δ) · E[e^(α_opt X)]   (51)
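When the optimality equation has no convenient closed form, the minimization over α can be done numerically. The Python sketch below (an illustration, not from the notes) searches a grid of α values, and sanity-checks the result for X ∼ N(0, 1), whose MGF e^(α²/2) gives the classic bound e^(−δ²/2) at α_opt = δ:

```python
from math import exp

def chernoff_bound(mgf, delta, alphas):
    """Minimize e^{-alpha*delta} * M_X(alpha) over a grid of alpha >= 0."""
    return min(exp(-a * delta) * mgf(a) for a in alphas)

# Sanity check with X ~ N(0, 1): M_X(a) = e^{a^2/2}. The optimum is
# alpha_opt = delta, giving the bound P(X >= delta) <= e^{-delta^2/2}.
delta = 3.0
alphas = [i * 0.001 for i in range(1, 10001)]
bound = chernoff_bound(lambda a: exp(a * a / 2.0), delta, alphas)
print(bound, exp(-delta ** 2 / 2.0))   # both ≈ 0.0111
```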
Example – For a Poisson(λ) r.v., E[e^(αX)] = e^(λ(e^α − 1)); minimizing e^(−3α) · e^(λ(e^α − 1)) gives α_opt = ln(3/λ) (valid for λ < 3, so that α_opt ≥ 0), and hence
P(X ≥ 3) ≤ e^(λ(e^(α_opt) − 1) − 3α_opt) = e^(3 − λ − 3 ln(3/λ))
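The bound can be compared with the exact Poisson tail. In the Python sketch below, λ = 2 is a hypothetical example value (any λ < 3 works):

```python
from math import exp, log, factorial

lam = 2.0    # example rate (lam < 3 so that alpha_opt = ln(3/lam) > 0)
k = 3

# Chernoff bound from the expression above
bound = exp(k - lam - k * log(k / lam))

# Exact tail P(X >= 3) for a Poisson(lam) pmf
exact = 1.0 - sum(exp(-lam) * lam ** j / factorial(j) for j in range(k))
print(bound, exact)   # bound ≈ 0.805 is a valid upper bound on exact ≈ 0.323
```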
U3 - Exercises
Given a zero-mean Gaussian r.v. with pdf
fX (x) = (1/(σ√(2π))) · exp(−x²/(2σ²))
show that
E{Xⁿ} = 1 · 3 · . . . · (n − 1) · σⁿ for n even, and E{Xⁿ} = 0 for n odd.
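The stated moment formula can be checked numerically, e.g. E{X⁴} = 1 · 3 · σ⁴ = 3σ⁴ and E{X³} = 0. The Python sketch below evaluates the defining integral with a simple midpoint rule (σ = 1.5 is a hypothetical example value):

```python
from math import exp, pi, sqrt

def gauss_moment(n, sigma, xmax=12.0, steps=100000):
    """E[X^n] for X ~ N(0, sigma^2), via a midpoint-rule integral."""
    dx = 2.0 * xmax / steps
    total = 0.0
    for i in range(steps):
        x = -xmax + (i + 0.5) * dx
        total += x ** n * exp(-x * x / (2.0 * sigma ** 2)) * dx
    return total / (sigma * sqrt(2.0 * pi))

sigma = 1.5
print(gauss_moment(4, sigma))   # ≈ 1*3*sigma^4 = 15.1875
print(gauss_moment(3, sigma))   # ≈ 0 (odd moments vanish by symmetry)
```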
For Y = aX + b, show that the correlation coefficient is ρ = 1 if a > 0 or ρ = −1 if a < 0.
Given a random vector X with mean vector
E{X} = [5 −5 6]^T
and (symmetric) covariance matrix
CX =
⎡  5   2  −1 ⎤
⎢  2   ·   0 ⎥
⎣ −1   0   4 ⎦
(the middle diagonal entry is illegible in the source), find the variance of
Y = A^T X + b,
with A = [2 −1 2]^T and b = 5.
9. Given two r.v.'s X and Y, define a new r.v.:
Z = [ κ · (X − µX)/σX + (Y − µY)/σY ]²
where κ is a real constant.
(a) Determine E[Z]
(b) Show that −1 ≤ ρXY ≤ 1, where ρXY is the correlation coefficient between X and Y.
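For item (a), expanding the square gives E[Z] = κ² + 2κρXY + 1, and requiring E[Z] ≥ 0 for all real κ yields item (b). A minimal Monte Carlo sketch in Python (the values ρ = 0.6 and κ = 2 are hypothetical) checks the expansion with correlated Gaussians:

```python
import random
from math import sqrt

random.seed(7)
rho, kappa, trials = 0.6, 2.0, 200000   # hypothetical example values

def draw_xy():
    """Correlated standard Gaussians (X, Y) with correlation rho."""
    g1, g2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    return g1, rho * g1 + sqrt(1.0 - rho ** 2) * g2

ez = sum((kappa * x + y) ** 2
         for x, y in (draw_xy() for _ in range(trials))) / trials
print(ez)   # close to kappa^2 + 2*kappa*rho + 1 = 7.4
```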
10. Suppose the amounts customers spend at a snack bar are i.i.d. r.v.'s with mean m = 8 reais and standard deviation σ = 2 reais. Estimate the probability that the first 100 customers spend a total:
fX (x) = (1/2) · e^(−|x|)
14. Sum of i.i.d. r.v.'s: Find the mean and the variance of the sum of n independent, identically distributed random variables, each with mean m and variance σ².
15. Sum of i.i.d. exponential r.v.'s: Find the PDF of the sum of n exponential r.v.'s, all with parameter a.
16. Find the Chernoff bound for an exponential r.v. with λ = 1. Compare the bound with the exact value of Pr[X > 5].
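For exercise 16, the comparison asked for can be sketched numerically. For an exponential r.v. with λ = 1, E[e^(aX)] = 1/(1 − a) for a < 1, so the Chernoff bound is min over 0 ≤ a < 1 of e^(−5a)/(1 − a), while the exact tail is Pr[X > 5] = e^(−5). A minimal Python sketch:

```python
from math import exp

# X ~ Exponential(lambda = 1): E[e^{aX}] = 1/(1 - a) for a < 1
delta = 5.0
alphas = [i * 0.0001 for i in range(1, 10000)]   # grid over 0 < a < 1
bound = min(exp(-a * delta) / (1.0 - a) for a in alphas)
exact = exp(-delta)                              # Pr[X > 5] = e^{-5}
print(bound, exact)   # bound ≈ 0.0916, exact ≈ 0.00674
```

The grid minimum lands at a = 1 − 1/δ = 0.8, giving the analytic bound 5e^(−4), roughly a factor of 13.6 above the exact value.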