Homework solution reference for CPE646
Homework 2 solutions
Problem 1.

P1.1 Maximum likelihood estimation. Given

$$p(x\mid\theta)=\begin{cases}\theta^{2}\,x\,e^{-\theta x}, & x\ge 0\\ 0, & \text{otherwise}\end{cases}$$

for $\theta>0$, the log-likelihood of $D=\{x_1,\dots,x_n\}$ is

$$l(\theta)=\ln p(D\mid\theta)=\ln\prod_{k=1}^{n}p(x_k\mid\theta)=\sum_{k=1}^{n}\ln p(x_k\mid\theta),$$

where

$$\ln p(x_k\mid\theta)=2\ln\theta+\ln x_k-\theta x_k.$$

Setting the derivative to zero,

$$\frac{dl}{d\theta}=\sum_{k=1}^{n}\left(\frac{2}{\theta}-x_k\right)=\frac{2n}{\theta}-\sum_{k=1}^{n}x_k=0,$$

we have

$$\hat{\theta}=\frac{2n}{\sum_{k=1}^{n}x_k}.$$
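The closed-form estimator can be checked numerically. In this sketch, the density $\theta^2 x e^{-\theta x}$ is treated as a Gamma distribution with shape 2 and rate $\theta$ (an equivalence, not something stated in the solution), so NumPy can draw samples from it directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# p(x|theta) = theta^2 * x * exp(-theta * x) for x >= 0 is a
# Gamma(shape=2, rate=theta) density, so scale = 1/theta in NumPy terms.
theta_true = 3.0
x = rng.gamma(shape=2.0, scale=1.0 / theta_true, size=100_000)

# Closed-form ML estimate derived above: theta_hat = 2n / sum(x_k).
n = x.size
theta_hat = 2 * n / x.sum()

print(theta_hat)  # close to theta_true for large n
```

With 100,000 samples the estimate lands very close to the true rate, consistent with the consistency of the ML estimator.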
P1.2 Bayesian estimation. Given the same $p(x\mid\theta)=\theta^{2}xe^{-\theta x}$ for $x\ge 0$ (0 otherwise), we estimate $\theta$ under the uniform prior

$$p(\theta)\sim U(0,A),\quad A>0\text{ and fixed},\qquad p(\theta)=\begin{cases}1/A, & 0\le\theta\le A\\ 0, & \text{otherwise.}\end{cases}$$

By Bayes' rule,

$$p(\theta\mid D)=\frac{p(D\mid\theta)\,p(\theta)}{\int p(D\mid\theta)\,p(\theta)\,d\theta}.$$

Let $\int p(D\mid\theta)\,p(\theta)\,d\theta=\alpha$, which is a normalization factor independent of $\theta$. The likelihood is

$$p(D\mid\theta)=\prod_{k=1}^{n}p(x_k\mid\theta)=\theta^{2n}\Big(\prod_{k=1}^{n}x_k\Big)e^{-\theta\sum_{k=1}^{n}x_k}.$$

We try to find the $\theta$ which maximizes $p(\theta\mid D)$ as our estimate of $\theta$. For $0<\theta\le A$,

$$\ln p(\theta\mid D)=2n\ln\theta+\sum_{k=1}^{n}\ln x_k-\theta\sum_{k=1}^{n}x_k-\ln A-\ln\alpha,$$

$$\frac{d}{d\theta}\ln p(\theta\mid D)=\frac{2n}{\theta}-\sum_{k=1}^{n}x_k=0,$$

so the unconstrained maximizer is $\hat{\theta}=2n/\sum_{k=1}^{n}x_k$. Because $\ln p(\theta\mid D)$ is unimodal and increases before its maximum, if $2n/\sum_{k=1}^{n}x_k>A$ we let $\hat{\theta}=A$.
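The MAP rule above amounts to clipping the unconstrained maximizer at the prior's upper end. A minimal sketch, assuming the same Gamma-form density and a hypothetical prior bound `A` chosen below the true rate so the clipping branch actually fires:

```python
import numpy as np

rng = np.random.default_rng(1)

# Samples from p(x|theta) = theta^2 x e^{-theta x}, i.e. Gamma(2, rate=theta).
theta_true = 3.0
x = rng.gamma(shape=2.0, scale=1.0 / theta_true, size=10_000)

A = 2.5  # upper end of the assumed uniform prior U(0, A)

# ln p(theta|D) increases up to 2n / sum(x) and decreases afterwards,
# so the MAP estimate is the unconstrained maximizer clipped to (0, A].
theta_unconstrained = 2 * x.size / x.sum()
theta_map = min(theta_unconstrained, A)

print(theta_unconstrained, theta_map)
```

Here the unconstrained estimate is near 3.0, which exceeds `A`, so the MAP estimate is pinned at `A = 2.5`.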
Let $m_1$ and $m_2$ be the sample means of the two classes $D_1$ and $D_2$,

$$m_i=\frac{1}{n_i}\sum_{x\in D_i}x,\qquad i=1,2.$$

The within-class scatter matrix is

$$S_W=\sum_{i=1}^{2}\sum_{x\in D_i}(x-m_i)(x-m_i)^{t},$$

and the LDA projection direction is

$$w=S_W^{-1}(m_1-m_2).$$

Each sample is projected to $y_k=w^{t}x_k$, and the reconstructed data is $\hat{x}_k=y_k w$.

We can see that if we use $y_k=w^{t}x_k$ for classification, which is the result of LDA for dimension reduction, the classification performance will be good.
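The LDA steps (class means, within-class scatter, Fisher direction, 1-D projection) can be sketched end to end. The two-class data below is hypothetical toy data, not the homework's actual samples:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical two-class 2-D data (the homework's real samples differ).
D1 = rng.normal(loc=[0.0, 0.0], scale=0.4, size=(50, 2))
D2 = rng.normal(loc=[4.0, 3.0], scale=0.4, size=(50, 2))

# Class means m_1, m_2.
m1 = D1.mean(axis=0)
m2 = D2.mean(axis=0)

# Within-class scatter S_W = sum_i sum_{x in D_i} (x - m_i)(x - m_i)^t.
S_W = (D1 - m1).T @ (D1 - m1) + (D2 - m2).T @ (D2 - m2)

# Fisher direction w = S_W^{-1} (m_1 - m_2), solved without explicit inverse.
w = np.linalg.solve(S_W, m1 - m2)

# 1-D projections y_k = w^t x_k used for classification.
y1 = D1 @ w
y2 = D2 @ w

print(y1.mean(), y2.mean())
```

With well-separated classes, the projected values of the two classes fall in disjoint ranges, which is why a simple threshold on $y_k$ classifies well.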