Machine Learning 2: Exercise Sheet 6


Exercises for the course Machine Learning 2
Abteilung Maschinelles Lernen
Institut für Softwaretechnik und theoretische Informatik
Fakultät IV, Technische Universität Berlin
Summer semester 2021
Prof. Dr. Klaus-Robert Müller
Email: [email protected]

Exercise Sheet 6
Exercise 1: Markov Model Forward Problem (20 P)
A Markov Model can be seen as a joint distribution over states at each time step q_1, \dots, q_T where q_t \in \{S_1, \dots, S_N\}, and where the probability distribution has the factored structure:

P(q_1, \dots, q_T) = P(q_1) \cdot \prod_{t=2}^{T} P(q_t \mid q_{t-1})

The factors are the probability of the initial state and the conditional distributions at every time step.

(a) Show that the following relation holds:

P(q_{t+1} = S_j) = \sum_{i=1}^{N} P(q_t = S_i) \, P(q_{t+1} = S_j \mid q_t = S_i)

for t \in \{1, \dots, T-1\} and j \in \{1, \dots, N\}.
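The relation in (a) is a single vector-matrix product per time step. A minimal numerical sketch (not part of the exercise; the 3-state model, transition matrix A and initial distribution pi are illustrative assumptions):

```python
import numpy as np

# Hypothetical 3-state Markov model; all numbers are illustrative.
# A[i, j] = P(q_{t+1} = S_j | q_t = S_i); each row sums to 1.
A = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])
pi = np.array([0.6, 0.3, 0.1])  # P(q_1 = S_i)

# Relation (a): P(q_{t+1} = S_j) = sum_i P(q_t = S_i) * A[i, j],
# i.e. propagating the marginal forward is p <- p @ A.
p = pi
for t in range(4):
    p = p @ A

print(p)        # marginal distribution P(q_5 = S_j)
print(p.sum())  # remains a valid distribution
```

Since each row of A sums to one, the propagated vector stays a probability distribution at every step.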

Exercise 2: Hidden Markov Model Forward Problem (20 P)


A Hidden Markov Model (HMM) can be seen as a joint distribution over hidden states q_1, \dots, q_T at each time step and corresponding observations O_1, \dots, O_T. As for the Markov Model, we have q_t \in \{S_1, \dots, S_N\}. The probability distribution of the HMM has the factored structure:

P(q_1, \dots, q_T, O_1, \dots, O_T) = P(q_1) \cdot \prod_{t=2}^{T} P(q_t \mid q_{t-1}) \cdot \prod_{t=1}^{T} P(O_t \mid q_t)

The factors are the probability of the initial state and the conditional distributions at every time step.

(a) Show that the following relation holds:

P(O_1, \dots, O_t, O_{t+1}, q_{t+1} = S_j) = \sum_{i=1}^{N} P(O_1, \dots, O_t, q_t = S_i) \, P(q_{t+1} = S_j \mid q_t = S_i) \, P(O_{t+1} \mid q_{t+1} = S_j)

for t \in \{1, \dots, T-1\} and j \in \{1, \dots, N\}.
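The relation in (a) is exactly the recursion of the forward algorithm: the quantities alpha_t(j) = P(O_1, \dots, O_t, q_t = S_j) are updated by summing over the previous state and multiplying by the emission probability. A minimal sketch (not part of the exercise; the 2-state, 2-symbol HMM and all its parameters are illustrative assumptions):

```python
import numpy as np

# Hypothetical 2-state, 2-symbol HMM; all numbers are illustrative.
A = np.array([[0.8, 0.2],   # A[i, j] = P(q_{t+1} = S_j | q_t = S_i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],   # B[j, k] = P(O_t = k | q_t = S_j)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])   # P(q_1 = S_i)

O = [0, 1, 1, 0]            # observed symbol indices

# alpha[j] = P(O_1, ..., O_t, q_t = S_j): the forward variables.
alpha = pi * B[:, O[0]]     # base case t = 1
for obs in O[1:]:
    # Relation (a): sum over the previous state, then emit.
    alpha = (alpha @ A) * B[:, obs]

print(alpha)        # joint probabilities P(O_1, ..., O_T, q_T = S_j)
print(alpha.sum())  # observation likelihood P(O_1, ..., O_T)
```

Summing the final forward variables over j gives the likelihood of the whole observation sequence, which is why this recursion solves the HMM forward problem in O(T N^2) time instead of summing over all N^T state paths.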

Exercise 3: Programming (60 P)


Download the programming files on ISIS and follow the instructions.
