
DS250 : HOMEWORK 4

NAME: M.V.D. SUPRIYA

ID NO: 12040810
February 21, 2023

problem 1
(a) The signal-plus-noise model is given by x_t = s_t + w_t, where

s_t = 0                                     for t = 1, 2, ..., 100
s_t = 10 e^{-(t-100)/200} cos(2πt/4)        for t = 101, ..., 200

The mean function is the expected value of x_t = s_t + w_t:

µ_t = E[x_t] = E[s_t] + E[w_t]

Since s_t is a fixed (deterministic) function of time and white noise has zero mean,

µ_t = s_t + 0 = s_t

so

µ_t = 0                                     for t = 1, 2, ..., 100
µ_t = 10 e^{-(t-100)/200} cos(2πt/4)        for t = 101, ..., 200

(b) Calculation of the autocovariance:

γ(s, t) = E[(x_s − µ_s)(x_t − µ_t)]

Since x_s = s_s + w_s, x_t = s_t + w_t, µ_s = s_s and µ_t = s_t, we can rewrite this as

γ(s, t) = E[(s_s + w_s − s_s)(s_t + w_t − s_t)] = E[w_s w_t]

Because white noise terms at different times are uncorrelated, E[w_s w_t] = σ_w² when s = t and 0 otherwise, so (with unit noise variance, σ_w² = 1, as assumed here)

γ(s, t) = 1    if s = t
γ(s, t) = 0    if s ≠ t,    for s, t = 1, 2, ..., 200
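Both results can be checked by simulation. Below is a minimal sketch (Python with NumPy is an assumption, as is σ_w² = 1): averaging many independent realisations of x_t recovers µ_t = s_t.

import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1, 201)
# signal: zero for t = 1..100, damped cosine for t = 101..200
s = np.where(t <= 100, 0.0,
             10 * np.exp(-(t - 100) / 200) * np.cos(2 * np.pi * t / 4))
# average 5000 independent realisations of x_t = s_t + w_t
reps = np.array([s + rng.standard_normal(200) for _ in range(5000)])
print(np.max(np.abs(reps.mean(axis=0) - s)))  # close to 0, i.e. mu_t = s_t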

problem 2
Calculation of the autocovariance of x_t = w_{t−1} + 2w_t + w_{t+1}:

γ(s, t) = cov(x_s, x_t) = cov(w_{s−1} + 2w_s + w_{s+1}, w_{t−1} + 2w_t + w_{t+1})

For s = t, i.e. |h| = 0:
γ(t, t) = cov(x_t, x_t) = cov(w_{t−1} + 2w_t + w_{t+1}, w_{t−1} + 2w_t + w_{t+1})
= cov(w_{t−1}, w_{t−1}) + 4 cov(w_t, w_t) + cov(w_{t+1}, w_{t+1})
= σ_w² + 4σ_w² + σ_w²
= 6σ_w²
For s = t + 1, i.e. |h| = 1:
γ(t+1, t) = cov(x_{t+1}, x_t) = cov(w_t + 2w_{t+1} + w_{t+2}, w_{t−1} + 2w_t + w_{t+1})
= 2 cov(w_t, w_t) + 2 cov(w_{t+1}, w_{t+1})
= 2σ_w² + 2σ_w²
= 4σ_w²
For s = t + 2, i.e. |h| = 2:
γ(t+2, t) = cov(x_{t+2}, x_t) = cov(w_{t+1} + 2w_{t+2} + w_{t+3}, w_{t−1} + 2w_t + w_{t+1})
= cov(w_{t+1}, w_{t+1})
= σ_w²

γ(h) = 0 when |h| > 2, as the two sums then share no common white noise terms. Summarising,

γ(h) = 6σ_w²    for |h| = 0
γ(h) = 4σ_w²    for |h| = 1
γ(h) = σ_w²     for |h| = 2
γ(h) = 0        for |h| > 2
Calculation of the autocorrelation:

ρ(s, t) = γ(s, t) / √(γ(s, s) γ(t, t)) = γ(h) / γ(0)

Using the covariance values from above:

ρ(h) = 1      for |h| = 0
ρ(h) = 2/3    for |h| = 1
ρ(h) = 1/6    for |h| = 2
ρ(h) = 0      for |h| > 2
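As a sanity check, the autocorrelations 1, 2/3, 1/6, 0 can be verified empirically; a short sketch (assuming Gaussian white noise with σ_w² = 1):

import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal(100_000)
x = w[:-2] + 2 * w[1:-1] + w[2:]   # x_t = w_{t-1} + 2 w_t + w_{t+1}
x = x - x.mean()
gamma0 = np.mean(x * x)            # sample gamma(0), approx 6 sigma_w^2
for h in range(4):
    gamma_h = np.mean(x[: len(x) - h] * x[h:])
    print(h, gamma_h / gamma0)     # approx 1, 2/3, 1/6, 0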

problem 3
solution:
Given that U_1 and U_2 are independent random variables with zero means and E[U_1²] = E[U_2²] = σ², the mean of x_t = U_1 sin(2πw_0 t) + U_2 cos(2πw_0 t) is

E[x_t] = E[U_1 sin(2πw_0 t) + U_2 cos(2πw_0 t)]
= E[U_1] sin(2πw_0 t) + E[U_2] cos(2πw_0 t)
= 0 + 0 = 0

Calculation of the autocovariance:

γ(h) = cov(x_{t+h}, x_t)
= cov(U_1 sin(2πw_0(t+h)) + U_2 cos(2πw_0(t+h)), U_1 sin(2πw_0 t) + U_2 cos(2πw_0 t))
= cov(U_1 sin(2πw_0(t+h)), U_1 sin(2πw_0 t)) + cov(U_2 cos(2πw_0(t+h)), U_2 cos(2πw_0 t))    [the cross terms vanish since U_1 and U_2 are independent]
= σ² sin(2πw_0(t+h)) sin(2πw_0 t) + σ² cos(2πw_0(t+h)) cos(2πw_0 t)
= σ² cos(2πw_0(t + h − t))    [using cos a cos b + sin a sin b = cos(a − b)]
= σ² cos(2πw_0 h)

Since the mean of the series is zero and its autocovariance function depends only on the time lag h, we have shown that the series is weakly stationary, with autocovariance function

γ(h) = σ² cos(2πw_0 h)
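A quick Monte Carlo sketch confirms γ(h) = σ² cos(2πw_0 h); the values w_0 = 0.1, σ² = 1, t = 30 and h = 5 below are illustrative choices, not part of the problem:

import numpy as np

rng = np.random.default_rng(2)
w0, t, h = 0.1, 30, 5              # illustrative frequency, time, lag
U1 = rng.standard_normal(200_000)  # E[U1] = 0, E[U1^2] = sigma^2 = 1
U2 = rng.standard_normal(200_000)
x_t = U1 * np.sin(2 * np.pi * w0 * t) + U2 * np.cos(2 * np.pi * w0 * t)
x_th = U1 * np.sin(2 * np.pi * w0 * (t + h)) + U2 * np.cos(2 * np.pi * w0 * (t + h))
print(np.mean(x_th * x_t))         # sample covariance across realisations
print(np.cos(2 * np.pi * w0 * h))  # theory: sigma^2 cos(2 pi w0 h)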
problem 4
(a) The predicted value of x_{t+1} can be written as x̂_{t+1} = A x_t.
The mean-square prediction error is then

MSE(A) = E[(x_{t+1} − A x_t)²]    (1)

To minimize the prediction error, we take its first derivative with respect to A and set it to zero:

dMSE/dA = −2 E[(x_{t+1} − A x_t) x_t]    (2)
0 = 2γ(1) − 2Aγ(0)    (3)

A = γ(1)/γ(0)    (4)

A = ρ(1)    (5)

(b) Now substituting A = ρ(1) into the prediction error:
MSE(A) = E[(x_{t+1} − ρ(1) x_t)²]
= E[x_{t+1}² − 2ρ(1) x_t x_{t+1} + ρ²(1) x_t²]
= γ(0) − 2ρ(1)γ(1) + ρ²(1)γ(0)
= γ(0)[1 − ρ²(1)]
(c) If x_{t+1} = A x_t exactly, the predictor reproduces the actual next term of the series, so the prediction error is zero, MSE(A) = 0. Then

MSE(A) = γ(0)[1 − ρ²(1)]
0 = γ(0)[1 − ρ²(1)]    (6)
ρ(1) = ±1

As this is the minimum case, the result derived in (a) still holds, i.e.

A = ρ(1)

so we have two cases:
For A > 0, ρ(1) = 1.
For A < 0, ρ(1) = −1.
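Part (a) can be illustrated numerically: for a stationary series, the MSE of the predictor A·x_t is minimised at A = ρ(1). The AR(1) series with φ = 0.6 below is an illustrative choice (its ρ(1) = φ):

import numpy as np

rng = np.random.default_rng(3)
phi, n = 0.6, 200_000
x = np.zeros(n)
for i in range(1, n):                  # AR(1): rho(1) = phi = 0.6
    x[i] = phi * x[i - 1] + rng.standard_normal()

rho1 = np.mean(x[:-1] * x[1:]) / np.mean(x * x)   # sample rho(1)
A_grid = np.linspace(-1, 1, 201)
mse = [np.mean((x[1:] - A * x[:-1]) ** 2) for A in A_grid]
print(A_grid[np.argmin(mse)], rho1)    # both approx 0.6
print(min(mse), np.mean(x * x) * (1 - rho1**2))  # part (b): min MSE = gamma(0)(1 - rho(1)^2)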

problem 5
(a)
The mean function is given by

E[x_t] = E[µ + w_t + θw_{t−1}]
= µ + E[w_t] + θE[w_{t−1}]    (7)
= µ + 0 + 0

Thus E[x_t] = µ.
(b)
ANS: The autocovariance function is given by

γ_x(h) = cov(x_t, x_{t+h})
= cov(µ + w_t + θw_{t−1}, µ + w_{t+h} + θw_{t+h−1})
= cov(w_t, w_{t+h}) + θ cov(w_t, w_{t+h−1}) + θ cov(w_{t−1}, w_{t+h}) + θ² cov(w_{t−1}, w_{t+h−1})
= (1 + θ²)σ_w² δ(h) + θσ_w² δ(h − 1) + θσ_w² δ(h + 1)    (8)

where δ(h) is the Kronecker delta function, which is 1 if h = 0 and 0 if h ≠ 0. So

γ_x(0) = (1 + θ²)σ_w²
γ_x(±1) = θσ_w²    (9)
γ_x(h) = 0 for any other h

Hence proved.
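These covariance values can be verified by simulation; θ = 0.5, µ = 0 and σ_w² = 1 below are illustrative choices:

import numpy as np

rng = np.random.default_rng(4)
theta, mu = 0.5, 0.0                   # illustrative parameter choices
w = rng.standard_normal(500_000)
x = mu + w[1:] + theta * w[:-1]        # x_t = mu + w_t + theta w_{t-1}
xc = x - x.mean()
print(np.mean(xc * xc), 1 + theta**2)      # gamma(0) = (1 + theta^2) sigma_w^2
print(np.mean(xc[:-1] * xc[1:]), theta)    # gamma(+-1) = theta sigma_w^2
print(np.mean(xc[:-2] * xc[2:]))           # gamma(2) approx 0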
(c)
ANS:
For a stationary series, the mean does not depend on time and the autocovariance function depends only on the lag. Here E[x_t] = µ is constant and does not depend on time, and we proved in part (b) that the autocovariance function depends only on the lag h and not on t. Thus the series x_t is stationary for all θ ∈ ℝ.
(d)
ANS:
The variance of the sample mean x̄ can be calculated as

var(x̄) = var( (1/n) Σ_{t=1}^{n} x_t )
= (1/n²) [ n γ_x(0) + 2(n − 1) γ_x(1) ]
= (1/n²) ( n σ_w²(1 + θ²) + 2(n − 1) σ_w² θ )    (10)
= (σ_w²/n) ( 1 + θ² + 2θ(n − 1)/n )

(i) if θ = 1, then var(x̄) = (σ_w²/n)(4 − 2/n) ≈ 4σ_w²/n
(ii) if θ = 0, then var(x̄) = σ_w²/n
(iii) if θ = −1, then var(x̄) = 2σ_w²/n², which is ≈ 0 for large n
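The three cases can be checked by simulating many sample means; the sketch below (n = 100 and 20000 replications are illustrative choices, with µ = 0 and σ_w² = 1) compares the empirical variance of x̄ with the exact formula:

import numpy as np

rng = np.random.default_rng(5)
n, reps = 100, 20_000
for theta in (1.0, 0.0, -1.0):
    w = rng.standard_normal((reps, n + 1))
    x = w[:, 1:] + theta * w[:, :-1]   # MA(1) with mu = 0, sigma_w^2 = 1
    xbar = x.mean(axis=1)
    exact = (n * (1 + theta**2) + 2 * (n - 1) * theta) / n**2
    print(theta, xbar.var(), exact)    # approx 0.04, 0.01, 0.0002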
(e)
ANS:
As the sample size n becomes large, the term (n − 1)/n approaches 1, so for fixed θ the variance of x̄ shrinks at rate 1/n; the accuracy of the estimate of the mean therefore improves as the sample size increases.
Across the three cases, the accuracy of the estimate is highest when θ = −1, followed by θ = 0, and lowest when θ = 1. This is because θ controls the correlation between neighbouring observations: when θ = −1 the noise terms of adjacent observations cancel in the sample mean (var(x̄) = 2σ_w²/n², which vanishes rapidly), whereas when θ = 1 the positive autocorrelation inflates var(x̄) to roughly 4σ_w²/n, making the mean harder to estimate.
problem 6
ANS: There is no oil and gas data.
