PTSP UNIT-III Questions & Answers
GRIET ECE
UNIT-3
1. Explain the following terms:
(a) Mean
(b) Mean square value
Ans:
(a) Mean: first moment (average value), denoted X̄ or m_1.
Let X be a random variable which assumes the values x_1, x_2, ....
Let the experiment be conducted n times, with
x_1 occurring n_1 times,
x_2 occurring n_2 times,
x_3 occurring n_3 times, and so on.
The average value or mean value of the random variable is given by

X̄ = (n_1 x_1 + n_2 x_2 + ...) / n
  = x_1 (n_1/n) + x_2 (n_2/n) + x_3 (n_3/n) + ...
  = x_1 P(x_1) + x_2 P(x_2) + x_3 P(x_3) + ...

X̄ = Σ_i x_i P(x_i)   for a discrete random variable.

For a continuous random variable,

X̄ = m_1 = ∫_{-∞}^{∞} x f(x) dx
(b) Mean square value (second moment):

X̄² = (n_1 x_1² + n_2 x_2² + ...) / n
   = x_1² (n_1/n) + x_2² (n_2/n) + x_3² (n_3/n) + ...
   = x_1² P(x_1) + x_2² P(x_2) + x_3² P(x_3) + ...

X̄² = Σ_i x_i² P(x_i)   for a discrete random variable.

X̄² = ∫_{-∞}^{∞} x² f(x) dx   for a continuous random variable.

In general, the nth moment is

X̄ⁿ = ∫_{-∞}^{∞} xⁿ f(x) dx
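The discrete formulas above can be checked with a short computation; the distribution below (a fair six-sided die) is an illustrative choice, not one from the text.

```python
# Mean and mean square value of a discrete random variable, following
# X̄ = Σ x_i P(x_i) and X̄² = Σ x_i² P(x_i) from the answer above.

def mean(values, probs):
    """First moment (average value)."""
    return sum(x * p for x, p in zip(values, probs))

def mean_square(values, probs):
    """Second moment (mean square value)."""
    return sum(x * x * p for x, p in zip(values, probs))

# Illustrative example: a fair six-sided die.
xs = [1, 2, 3, 4, 5, 6]
ps = [1 / 6] * 6

print(mean(xs, ps))         # ≈ 3.5
print(mean_square(xs, ps))  # ≈ 15.1667 (= 91/6)
```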

2. Explain different types of inequalities?
Ans: There are three types of inequalities as discussed below:
(a) Chebyshev's inequality: A useful tool in some probability problems is Chebyshev's inequality. For a random variable X with mean value X̄ and variance σ_X², it states that

P(|X − X̄| ≥ ε) ≤ σ_X² / ε²   for any ε > 0.

(b) Markov's inequality: Another inequality that is useful in probability problems is Markov's inequality, which applies to a nonnegative random variable X:

P(X ≥ a) ≤ E[X] / a,   a > 0.

The restriction to nonnegative random variables is relaxed by Chernoff's inequality.

(c) Chernoff's inequality and bound: Let us try to understand Chernoff's inequality through an example.
Let X be any random variable, nonnegative or not. For any real v > 0 it is clear from a sketch that

exp[v(x − a)] ≥ u(x − a)   ...(1)

where u(·) is the unit step function and a is an arbitrary real constant.

Since P(X ≥ a) = ∫_a^∞ f_X(x) dx = ∫_{-∞}^{∞} f_X(x) u(x − a) dx   ...(2)

and the moment generating function is defined by

M_X(v) = E[e^{vX}] = ∫_{-∞}^{∞} f_X(x) e^{vx} dx,   −∞ < v < ∞   ...(3)

from (1), (2) and (3) we can write

P(X ≥ a) ≤ ∫_{-∞}^{∞} f_X(x) e^{v(x−a)} dx = e^{−va} M_X(v)   ...(4)

Equation (4) is called Chernoff's inequality. Because the right side is a function of the parameter v, it can be minimized with respect to that parameter; the minimum value is called the Chernoff bound.
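The three bounds can be compared numerically for a concrete case; the exponential distribution used here is an illustrative assumption, not an example from the text.

```python
import math

# Compare Markov, Chebyshev, and Chernoff bounds with the exact tail
# probability P(X >= a) for X ~ Exponential(1): E[X] = 1, Var[X] = 1,
# and MGF M_X(v) = 1/(1-v) for v < 1.
mu, var = 1.0, 1.0
a = 3.0  # threshold (illustrative)

exact = math.exp(-a)                 # P(X >= a) = e^{-a} for Exp(1)
markov = mu / a                      # P(X >= a) <= E[X]/a
# P(|X - mu| >= a - mu) <= var/(a - mu)^2, which covers P(X >= a):
chebyshev = var / (a - mu) ** 2
v_opt = 1 - 1 / a                    # minimizes e^{-va} M_X(v) over v
chernoff = math.exp(-v_opt * a) / (1 - v_opt)  # = a e^{1-a}

print(exact, markov, chebyshev, chernoff)
# every bound is at least the exact tail probability
assert exact <= min(markov, chebyshev, chernoff)
```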

3. Show that
(a) var(cX) = c² var(X)
(b) var(X + Y) = var(X) + var(Y)
(c) var(X − Y) = var(X) + var(Y)
where, for (b) and (c), X and Y are independent.
Ans:
(a) var(cX) = c² var(X):
var(cX) = E[(cX − cX̄)²]
        = E[c² (X − X̄)²]
        = c² E[(X − μ)²]   where X̄ = μ
        = c² var(X)
(b) var(X + Y) = var(X) + var(Y):
var(X + Y) = E[((X + Y) − (X̄ + Ȳ))²]
           = E[((X − μ_X) + (Y − μ_Y))²]
           = E[(X − μ_X)²] + E[(Y − μ_Y)²] + 2 E[(X − μ_X)(Y − μ_Y)]
For independent X and Y, E[(X − μ_X)(Y − μ_Y)] = E[X − μ_X] E[Y − μ_Y] = 0 · 0 = 0,
since E[X − μ_X] = E[X] − μ_X = μ_X − μ_X = 0.
Therefore var(X + Y) = var(X) + var(Y).
(c) var(X − Y) = var(X) + var(Y):
var(X − Y) = var[X + (−1)Y]
           = var(X) + var((−1)Y)
           = var(X) + (−1)² var(Y)
Therefore var(X − Y) = var(X) + var(Y).
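The three identities can be checked by simulation; the Gaussian samples and constants below are illustrative choices.

```python
import random
import statistics

# Monte Carlo check of var(cX) = c² var(X) and var(X ± Y) = var(X) + var(Y);
# X and Y are drawn independently, which is what makes (b) and (c) hold.
random.seed(0)
n = 200_000
X = [random.gauss(0, 2) for _ in range(n)]   # Var(X) ≈ 4
Y = [random.gauss(1, 3) for _ in range(n)]   # Var(Y) ≈ 9
c = 5.0

var_x = statistics.pvariance(X)
var_y = statistics.pvariance(Y)
var_cx = statistics.pvariance([c * x for x in X])
var_sum = statistics.pvariance([x + y for x, y in zip(X, Y)])
var_diff = statistics.pvariance([x - y for x, y in zip(X, Y)])

print(var_cx / (c * c * var_x))    # ≈ 1  (a)
print(var_sum / (var_x + var_y))   # ≈ 1  (b)
print(var_diff / (var_x + var_y))  # ≈ 1  (c)
```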

4. Explain about the moment generating function and the properties of the moment generating function?
Ans: The moment generating function (MGF) of a random variable X is defined as

M_X(t) = E[e^{tX}]

If X is a discrete random variable,

M_X(t) = Σ_i e^{t x_i} P(x_i)

If X is a continuous random variable,

M_X(t) = ∫_{-∞}^{∞} e^{tx} f(x) dx
To justify the name moment generating function, consider

M_X(t) = E[e^{tX}]
       = E[1 + tX + t²X²/2! + t³X³/3! + ...]
       = 1 + t E(X) + (t²/2!) E(X²) + (t³/3!) E(X³) + ...

This expansion is a power series in t whose coefficients are the moments of the random variable. Thus the moment generating function gives the moments of a random variable, but only moments about the origin.
Consider

(d/dt) M_X(t) |_{t=0} = (d/dt)[1 + t E(X) + (t²/2!) E(X²) + (t³/3!) E(X³) + ...] |_{t=0}
                      = E(X) = m_1   (first moment)

Similarly

(d²/dt²) M_X(t) |_{t=0} = E(X²) = m_2   (second moment), and so on.

Thus the nth moment of the random variable can be obtained as

m_n = (d^n/dt^n) M_X(t) |_{t=0}
PROPERTIES OF THE MOMENT GENERATING FUNCTION:
1. Let X be a random variable with moment generating function M_X(t). Then the moment generating function of Y = aX + b is
M_Y(t) = E[e^{Yt}] = E[e^{(aX+b)t}] = E[e^{aXt} e^{bt}] = e^{bt} E[e^{X(at)}] = e^{bt} M_X(at)
2. If M_X(t) is the moment generating function of the random variable X, then the MGF of Y = kX is
M_Y(t) = E[e^{tY}] = E[e^{t(kX)}] = E[e^{X(kt)}] = M_X(kt)
3. If M_X(t) is the moment generating function of the random variable X, then the MGF of Y = (X + a)/b is
M_Y(t) = E[e^{tY}] = E[e^{t(X+a)/b}] = E[e^{tX/b} e^{at/b}] = e^{at/b} M_X(t/b)
4. If two random variables X and Y have MGFs M_X(t) and M_Y(t) such that M_X(t) = M_Y(t), then X and Y are said to have identical distribution, i.e., identical density.
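The relation m_n = (d^n/dt^n) M_X(t)|_{t=0} can be illustrated by differentiating an MGF numerically; the Bernoulli(p) variable below is an illustrative assumption, not an example from the text.

```python
import math

# Recover m1 and m2 from the MGF by finite differences at t = 0.
# Illustrative example: X ~ Bernoulli(p), whose MGF is
# M_X(t) = E[e^{tX}] = (1 - p) + p e^t.
p = 0.3
M = lambda t: 1 - p + p * math.exp(t)

h = 1e-3
m1 = (M(h) - M(-h)) / (2 * h)             # d/dt M_X(t) at t=0   -> E[X]
m2 = (M(h) - 2 * M(0) + M(-h)) / (h * h)  # d²/dt² M_X(t) at t=0 -> E[X²]

print(m1, m2)  # both ≈ 0.3, since X ∈ {0, 1} gives E[X] = E[X²] = p
```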
5. Explain briefly about the characteristic function and also about its properties?
Ans: The characteristic function of a random variable X is defined as

Φ_X(ω) = E[e^{jωX}]

For a continuous random variable X,

Φ_X(ω) = ∫_{-∞}^{∞} e^{jωx} f(x) dx

and for a discrete random variable,

Φ_X(ω) = Σ_i e^{jωx_i} p(x_i)
Consider

|Φ_X(ω)| = |∫_{-∞}^{∞} e^{jωx} f(x) dx| ≤ ∫_{-∞}^{∞} |e^{jωx}| |f(x)| dx

Since |e^{jωx}| = 1,

|Φ_X(ω)| ≤ ∫_{-∞}^{∞} |f(x)| dx = 1

Therefore |Φ_X(ω)| ≤ 1. Thus the characteristic function of a random variable is always convergent, i.e., of finite magnitude. To generate moments from the characteristic function, consider

Φ_X(ω) = E[e^{jωX}]
        = E[1 + jωX + (jωX)²/2! + (jωX)³/3! + ...]
        = 1 + jω E(X) + ((jω)²/2!) E(X²) + ...

This expansion is a polynomial in (jω), where the coefficient of each term involving (jω) is a moment of the random variable, i.e., the coefficient of (jω) is the first moment m_1, the coefficient of (jω)²/2! is the second moment m_2, and so on. The characteristic function gives the moments of a random variable about the origin only, as follows.

Consider

(d/dω) Φ_X(ω) |_{ω=0} = (d/dω)[1 + jω E(X) + ((jω)²/2!) E(X²) + ...] |_{ω=0}

Therefore E(X) = (1/j) (d/dω) Φ_X(ω) |_{ω=0}

In general, the nth moment of a random variable X can be obtained as

E(Xⁿ) = (1/j)ⁿ (d^n/dω^n) Φ_X(ω) |_{ω=0}
PROPERTIES OF THE CHARACTERISTIC FUNCTION
Let Φ_X(ω) be the characteristic function of the random variable X.
1. |Φ_X(ω)| is maximum at ω = 0, and the maximum value is 1.
With reference to the convergence of the characteristic function,
|Φ_X(ω)| = |∫_{-∞}^{∞} e^{jωx} f(x) dx| ≤ 1
At ω = 0, |Φ_X(0)| = |∫_{-∞}^{∞} f(x) dx| = 1
Therefore |Φ_X(ω)| is maximum, and equal to 1, at ω = 0.
2. The characteristic function of the random variable Y = aX + b is
Φ_Y(ω) = E[e^{jωY}] = E[e^{jω(aX+b)}] = E[e^{jωaX} e^{jωb}] = e^{jbω} E[e^{X(aω)j}] = e^{jbω} Φ_X(aω)
3. The characteristic function of Y = (X + a)/b, where a and b (≠ 0) are constants, is
Φ_Y(ω) = E[e^{jωY}] = E[e^{jω(X+a)/b}] = e^{jωa/b} E[e^{j(ω/b)X}] = e^{jωa/b} Φ_X(ω/b)
4. If the characteristic functions of two random variables are identical, then the random variables are said to be of identical distribution, i.e., they are of identical density.
5. Consider the expression for the Fourier transform of a function f(t):
F(ω) = ∫_{-∞}^{∞} e^{−jωt} f(t) dt,   where t is the time variable.
The characteristic function of the random variable X is given as
Φ_X(ω) = ∫_{-∞}^{∞} e^{jωx} f(x) dx
Comparing the expressions for Φ_X(ω) and F(ω), it can be concluded that, except for the sign of the exponent, both expressions are similar.
The expression for the inverse Fourier transform is
f(x) = (1/2π) ∫_{-∞}^{∞} e^{−jωx} Φ_X(ω) dω
Since f(t) ↔ F(ω), it follows that f(x) ↔ Φ_X(ω), i.e., for a random variable, the probability density function and the characteristic function form a Fourier transform pair.
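The boundedness property and the moment relation E(X) = (1/j) dΦ_X/dω|_{ω=0} can be checked directly; the discrete distribution below is an illustrative assumption.

```python
import cmath

# Characteristic function of a discrete random variable,
# Φ_X(ω) = Σ e^{jω x_i} p(x_i); the sample distribution is illustrative.
def char_fn(omega, values, probs):
    return sum(cmath.exp(1j * omega * x) * p for x, p in zip(values, probs))

xs = [-2, 1, 3]
ps = [0.3, 0.5, 0.2]

# Property 1: |Φ_X(ω)| <= 1 everywhere, with Φ_X(0) = 1.
assert abs(char_fn(0.0, xs, ps) - 1) < 1e-12
assert all(abs(char_fn(w, xs, ps)) <= 1 + 1e-12 for w in (-5, -1, 0.5, 2, 10))

# First moment via E(X) = (1/j) dΦ_X/dω at ω = 0 (numerical derivative).
h = 1e-4
m1 = ((char_fn(h, xs, ps) - char_fn(-h, xs, ps)) / (2 * h) / 1j).real
print(m1)  # ≈ E[X] = (-2)(0.3) + (1)(0.5) + (3)(0.2) = 0.5
```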
6. Explain in detail about the uniform random variable?
Ans:
The density of a uniform random variable X over (a, b) is

f(x) = 1/(b − a)   for a ≤ x ≤ b
     = 0           elsewhere

The corresponding CDF is

F_X(x) = 0                 for x < a
       = (x − a)/(b − a)   for a ≤ x ≤ b
       = 1                 for x > b
MEAN OF A UNIFORM RANDOM VARIABLE

E(X) = ∫_a^b x f(x) dx = ∫_a^b x · (1/(b − a)) dx = (1/(b − a)) [x²/2]_a^b = (a + b)/2
MEAN SQUARE VALUE OF A UNIFORM RANDOM VARIABLE

E(X²) = ∫_a^b x² f(x) dx = (1/(b − a)) [x³/3]_a^b = (b³ − a³)/(3(b − a)) = (a² + ab + b²)/3

VARIANCE OF A UNIFORM RANDOM VARIABLE

Var(X) = E(X²) − [E(X)]²
       = (a² + ab + b²)/3 − (a + b)²/4
       = (b − a)²/12
MOMENT GENERATING FUNCTION OF A UNIFORM RANDOM VARIABLE

M_X(t) = E[e^{tX}] = ∫_a^b e^{tx} f(x) dx = (1/(b − a)) ∫_a^b e^{tx} dx = (1/(b − a)) [e^{tx}/t]_a^b

M_X(t) = (e^{bt} − e^{at}) / ((b − a)t)

CHARACTERISTIC FUNCTION OF A UNIFORM RANDOM VARIABLE

Φ_X(ω) = M_X(t) |_{t=jω} = (e^{jbω} − e^{jaω}) / ((b − a)jω)
7. A random variable X is defined by
X = −2 with probability 1/3
  = 3 with probability 1/2
  = 1 with probability 1/6
Find the variance of Y = 2X + 5.
Sol: When X = −2, Y = 1; when X = 3, Y = 11; when X = 1, Y = 7.
The probability distribution of Y is

Y_j:     1    7    11
P(Y_j):  1/3  1/6  1/2

E(Y) = 1·(1/3) + 7·(1/6) + 11·(1/2) = 7
Var(Y) = Σ (y_j − m)² P(y_j)
       = (1 − 7)²·(1/3) + (7 − 7)²·(1/6) + (11 − 7)²·(1/2)
       = 12 + 0 + 8 = 20
As a check, E(Y²) = 1·(1/3) + 49·(1/6) + 121·(1/2) = 414/6 = 69, so Var(Y) = 69 − 7² = 20.
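The same result follows from the earlier identity var(aX + b) = a² var(X), which a short computation confirms:

```python
# Check Var(Y) for Y = 2X + 5 directly, and via Var(aX + b) = a² Var(X).
xs = [-2, 3, 1]
ps = [1 / 3, 1 / 2, 1 / 6]

ex = sum(x * p for x, p in zip(xs, ps))        # E[X]  = 1
ex2 = sum(x * x * p for x, p in zip(xs, ps))   # E[X²] = 6
var_x = ex2 - ex ** 2                          # Var(X) = 5

ys = [2 * x + 5 for x in xs]
ey = sum(y * p for y, p in zip(ys, ps))        # E[Y] = 7
var_y = sum((y - ey) ** 2 * p for y, p in zip(ys, ps))

print(var_y, 4 * var_x)  # both ≈ 20
```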
8. Find the moment generating function of a random variable X with pdf
f(x) = e^{−x} for x ≥ 0.
Find the first two moments about the origin.
Ans: M_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} f(x) dx = ∫_0^∞ e^{tx} e^{−x} dx
    = ∫_0^∞ e^{−(1−t)x} dx
    = [e^{−(1−t)x} / (−(1 − t))]_0^∞
    = 1/(1 − t),   for t < 1

1st moment:
m_1 = (d/dt) M_X(t) |_{t=0} = (d/dt)[1/(1 − t)] |_{t=0} = 1/(1 − t)² |_{t=0} = 1

2nd moment:
m_2 = (d²/dt²) M_X(t) |_{t=0} = (d/dt)[1/(1 − t)²] |_{t=0} = 2/(1 − t)³ |_{t=0} = 2
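The closed form 1/(1 − t) and the first moment can be confirmed numerically; the trapezoidal rule on a truncated range below is an approximation, with the truncation point chosen so the neglected tail is negligible.

```python
import math

# Numerically confirm M(t) = ∫₀^∞ e^{tx} e^{-x} dx = 1/(1 - t) for t < 1,
# using a simple trapezoidal rule on [0, 50].
def mgf(t, upper=50.0, n=200_000):
    h = upper / n
    f = lambda x: math.exp((t - 1.0) * x)
    s = 0.5 * (f(0.0) + f(upper)) + sum(f(i * h) for i in range(1, n))
    return s * h

print(mgf(0.5), 1 / (1 - 0.5))   # both ≈ 2

# First moment by a central difference of the MGF at t = 0.
h = 1e-3
m1 = (mgf(h) - mgf(-h)) / (2 * h)
print(m1)  # ≈ 1
```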
9. A Gaussian random variable X with mean m = 0.6 and standard deviation σ = 0.8 is transformed to a new random variable by the transformation
Y = T(X) = 4    for 1.0 ≤ X < ∞
         = 2    for 0 ≤ X < 1.0
         = −2   for −1.0 ≤ X < 0
         = −4   for −∞ < X < −1.0
Find the mean and variance of Y.
Sol: Work with the standardized variable Z = (X − m)/σ.
P(1.0 ≤ X < ∞): for X = 1, z = (1 − 0.6)/0.8 = 0.5, so
P(X ≥ 1) = P(Z ≥ 0.5) = 0.5 − P(0 ≤ Z < 0.5) = 0.5 − 0.1915 = 0.3085
P(0 ≤ X < 1.0): for X = 0, z = (0 − 0.6)/0.8 = −0.75, so
P(0 ≤ X < 1) = P(−0.75 ≤ Z < 0.5) = P(0 ≤ Z < 0.75) + P(0 ≤ Z < 0.5) = 0.2734 + 0.1915 = 0.4649
P(−1.0 ≤ X < 0): for X = −1, z = (−1 − 0.6)/0.8 = −2, so
P(−1 ≤ X < 0) = P(−2 ≤ Z < −0.75) = P(0 ≤ Z < 2) − P(0 ≤ Z < 0.75) = 0.4772 − 0.2734 = 0.2038
P(−∞ < X < −1.0) = P(Z < −2) = 0.5 − 0.4772 = 0.0228
(Since the mean is 0.6 rather than 0, the density is not symmetric about X = 0, so the negative intervals must be computed separately.)
Therefore the probability distribution of Y is

Y = y_j:     −4      −2      2       4
P(Y = y_j):  0.0228  0.2038  0.4649  0.3085

E(Y) = (−4)(0.0228) + (−2)(0.2038) + (2)(0.4649) + (4)(0.3085) ≈ 1.665
E(Y²) = 16(0.0228) + 4(0.2038) + 4(0.4649) + 16(0.3085) ≈ 7.975
Var(Y) = E(Y²) − [E(Y)]² ≈ 7.975 − 2.772 = 5.203
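The interval probabilities can be computed exactly with the error function rather than a table; note that, because the mean is 0.6 rather than 0, the density is not symmetric about X = 0.

```python
import math

# Interval probabilities for X ~ N(m = 0.6, σ = 0.8) via the standard
# normal CDF Φ(z) = (1 + erf(z/√2))/2.
m, sigma = 0.6, 0.8
Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
F = lambda x: Phi((x - m) / sigma)   # CDF of X

p4 = 1 - F(1.0)         # P(1.0 <= X)     ≈ 0.3085 -> Y = 4
p2 = F(1.0) - F(0.0)    # P(0 <= X < 1)   ≈ 0.4649 -> Y = 2
pm2 = F(0.0) - F(-1.0)  # P(-1 <= X < 0)  ≈ 0.2038 -> Y = -2
pm4 = F(-1.0)           # P(X < -1)       ≈ 0.0228 -> Y = -4

ey = -4 * pm4 - 2 * pm2 + 2 * p2 + 4 * p4
ey2 = 16 * pm4 + 4 * pm2 + 4 * p2 + 16 * p4
print(ey, ey2 - ey ** 2)  # E(Y) ≈ 1.665, Var(Y) ≈ 5.203
```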
9. X is a Gaussian random variable N(m, σ²). Find an upper bound for P(X ≥ K), where K is a constant.
Ans:
This can be evaluated by using the Chernoff bound, stated as
P(X ≥ K) ≤ e^{−Kt} M_X(t),   t > 0,
where M_X(t) is the MGF of X.
The tightest bound is obtained by minimizing the right-hand side with respect to t.
The MGF of a Gaussian random variable is

M_X(t) = e^{mt + σ²t²/2}

Therefore P(X ≥ K) ≤ e^{−Kt} · e^{mt + σ²t²/2} = e^{−(K−m)t + σ²t²/2}

Setting (d/dt) e^{−(K−m)t + σ²t²/2} = 0:
⟹ e^{−(K−m)t + σ²t²/2} [−(K − m) + σ²t] = 0
⟹ −(K − m) + σ²t = 0
⟹ t = (K − m)/σ²

The minimum of e^{−(K−m)t + σ²t²/2} is

e^{−(K−m)t + σ²t²/2} |_{t=(K−m)/σ²} = e^{−(K−m)²/(2σ²)}

Therefore the Chernoff bound is

P(X ≥ K) ≤ e^{−(K−m)²/(2σ²)}
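The bound can be compared with the exact Gaussian tail probability Q((K − m)/σ); the parameter values below are illustrative.

```python
import math

# Chernoff bound e^{-(K-m)²/(2σ²)} vs. the exact Gaussian tail
# probability, computed with the complementary error function.
m, sigma, K = 0.0, 1.0, 2.0   # illustrative parameters

bound = math.exp(-(K - m) ** 2 / (2 * sigma ** 2))
exact = 0.5 * math.erfc((K - m) / (sigma * math.sqrt(2.0)))

print(exact, bound)  # exact ≈ 0.0228 <= bound ≈ 0.1353
assert exact <= bound
```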
10. A random variable X has the probabilities given as
P(X = 0) = P(X = 2) = p and P(X = 1) = 1 − 2p,
with 0 ≤ p ≤ 1/2 (so that the probabilities sum to one). For what value of p is the variance of X maximum?
Ans:
E(X) = 0·p + 1·(1 − 2p) + 2·p = 1
E(X²) = 0·p + 1·(1 − 2p) + 4·p = 1 + 2p
Var(X) = E(X²) − [E(X)]² = 1 + 2p − 1 = 2p
Since Var(X) = 2p increases with p, the variance is maximum at the largest admissible value,
p = 1/2
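With the probabilities normalized so they sum to one (P(X = 1) = 1 − 2p), the variance can be scanned over the admissible range of p to confirm where it peaks:

```python
# Var(X) as a function of p for P(X=0) = P(X=2) = p, P(X=1) = 1 - 2p,
# scanned over 0 <= p <= 1/2.
def variance(p):
    xs, ps = [0, 1, 2], [p, 1 - 2 * p, p]
    ex = sum(x * q for x, q in zip(xs, ps))
    ex2 = sum(x * x * q for x, q in zip(xs, ps))
    return ex2 - ex ** 2

grid = [i / 100 for i in range(51)]   # p = 0.00, 0.01, ..., 0.50
best = max(grid, key=variance)
print(best, variance(best))  # maximum at p = 0.5, where Var(X) = 2p = 1
```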