Pattern Recognition and Classification Homework 4: K-Nearest Neighbor and Batch Perceptron

Homework solution reference for CPE646


Stevens Institute of Technology
Department of Electrical and Computer Engineering
CpE 646 Pattern Recognition and Classification
Homework 4

Problem 1: Once again we use the data sets “hw3_2_1” and “hw3_2_2” from Homework 3. The sample vectors in “hw3_2_1” are from class ω1 and the sample vectors in “hw3_2_2” are from class ω2.

Use the k-nearest neighbor method to estimate the class conditional density functions p(x|ω1) and p(x|ω2) for every x in {-4:0.1:8, -4:0.1:8}; use the “mesh” function in Matlab to plot the results; and then classify x = [1, -2]t based on the estimation. Let k = 10.

(Hint: The “sort” function in Matlab can be used to find the closest neighbors.)

clear all
clc
load('hw3.mat')
label1=ones(1,100);
label2=1+label1;
label=[label1 label2];
data1=hw3_2_1;
data2=hw3_2_2;
[r,c]=size(data1);
train_data=[data1 data2];
P1=zeros(121,121);
P2=zeros(121,121);
K=10;
w=0;
for i=-4:0.1:8
    w=w+1; h=0;
    for j=-4:0.1:8
        h=h+1;
        ki=0; kj=0;
        test_data=[i;j];
        % calculate the Euclidean distance to every training sample
        distance=zeros(1,200);
        for x=1:200
            a=i-train_data(1,x);
            c=j-train_data(2,x);
            distance(x)=sqrt(a^2+c^2);
        end
        % sort the distances and keep the K closest neighbors
        [dist,neighbors]=sort(distance);
        dist=dist(1:K);
        neighbors=neighbors(1:K);
        % count how many of the K neighbors carry each class label
        for k=1:K
            if label(neighbors(k))==1
                ki=ki+1;
            else
                kj=kj+1;
            end
        end
        P1(w,h)=ki/K;
        P2(w,h)=kj/K;
        if i==1 && j==-2
            p1=P1(w,h)
            p2=P2(w,h)
        end
    end
end
b=[-4:0.1:8];
mesh(b,b,P1)
hold on
mesh(b,b,P2)
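
For the single query point x = [1, -2]t, the same k-nearest-neighbor vote can also be computed without the grid loop. Below is a minimal sketch, assuming the train_data, label, and K variables defined above; the names query, nearest, and votes1/votes2 are illustrative only, and the vectorized subtraction relies on implicit expansion (MATLAB R2016b or newer; older versions would need repmat).

% k-NN vote for a single query point (sketch; assumes train_data, label, K exist)
query = [1; -2];
d = sqrt(sum((train_data - query).^2, 1));   % distances to all 200 training samples
[~, idx] = sort(d);
nearest = idx(1:K);                          % indices of the K closest samples
votes1 = sum(label(nearest) == 1);           % neighbors labeled class w1
votes2 = K - votes1;                         % neighbors labeled class w2
if votes1 > votes2
    disp('classify x as class w1')
else
    disp('classify x as class w2')
end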

2.2 Assume a φ function which projects each input vector x = [x1, x2]t to x̂ = [x1, x2, x1x2]t. Plot x̂ in 3-D using the Matlab function “plot3”.

clear all
clc
load('hw4.mat')
data1=hw4_2_1;
data2=hw4_2_2;
% project each 2-D sample to 3-D: [x1; x2; x1*x2]
data11=[data1;(data1(1,:)).*data1(2,:)];
data22=[data2;(data2(1,:)).*data2(2,:)];
plot3(data11(1,:),data11(2,:),data11(3,:),'*',data22(1,:),data22(2,:),data22(3,:),'o')
title('data set');
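
As a quick sanity check of the projection, a single illustrative point (not taken from the data set) can be mapped by hand: x = [1, -2]t goes to x̂ = [1, -2, -2]t, because the added third coordinate is just the product x1·x2.

xv=[1;-2];                 % an illustrative input vector
xhat=[xv; xv(1)*xv(2)]     % phi(x) = [x1; x2; x1*x2] = [1; -2; -2]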

2.3 Now define an augmented vector y = [1, x1, x2, x1x2]t. Use the Batch Perceptron method (pages 35 and 39, CPE646-9) to find the weight vector a = [a0, a1, a2, a3]t in the generalized linear discriminant function (pages 22, 23, CPE646-9).

(Hint: let η = 1, θ = 1, and initialize a.)

clear all
clc
load('hw4.mat')
data1=hw4_2_1;
data2=hw4_2_2;
% normalized (augmented) sample vectors for class 1
n=ones(1,100);
y1=[n;data1;(data1(1,:)).*data1(2,:)];
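
The listing above ends after constructing y1. A minimal sketch of how the batch perceptron iteration could continue is given below; it assumes the hint's two parameters are the learning rate η = 1 and the stopping threshold θ = 1, starts from an all-zero weight vector (an illustrative choice, since the original initialization is not legible in this copy), and negates the class-2 samples so that a correctly classified sample satisfies a'*y > 0. The names y2, Y, mis, and grad are illustrative, not part of the original solution.

y2=-[n;data2;(data2(1,:)).*data2(2,:)];    % class-2 samples, augmented and negated
Y=[y1 y2];                                 % all normalized sample vectors (4 x 200)
a=zeros(4,1);                              % initial weight vector a = [a0; a1; a2; a3]
eta=1;                                     % learning rate (from the hint)
theta=1;                                   % stopping threshold (from the hint)
for iter=1:1000                            % cap the number of epochs as a safeguard
    mis = (a'*Y <= 0);                     % samples misclassified by the current a
    grad = sum(Y(:,mis), 2);               % sum of the misclassified sample vectors
    if norm(eta*grad) < theta              % stop when the batch update is small enough
        break
    end
    a = a + eta*grad;                      % batch perceptron update
end
a                                          % learned weight vector

If the projected data are linearly separable in the 3-D feature space, the loop stops once no update larger than θ remains; otherwise the epoch cap prevents an infinite loop.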