
Probability and Stochastic Processes:

A Friendly Introduction for Electrical and Computer Engineers

Roy D. Yates and David J. Goodman

Problem Solutions: Yates and Goodman, 3.1.2 3.3.2 3.4.2 3.5.2 3.6.2 3.6.3 and 3.7.3

Problem 3.1.2

On the X,Y plane, the joint PMF is

PX,Y(x,y) = c|x + y| for x = −2, 0, 2 and y = −1, 0, 1 (and zero otherwise), i.e.

PX,Y(x,y)   x = −2   x = 0   x = 2
y = −1        3c       c       c
y = 0         2c       0      2c
y = 1          c       c      3c

(a) To find c, we sum the PMF over all possible values of X and Y. We choose c so the sum equals one.

∑_x ∑_y PX,Y(x,y) = ∑_{x=−2,0,2} ∑_{y=−1,0,1} c|x + y| = 6c + 2c + 6c = 14c

Thus c = 1/14.
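As a quick numerical cross-check of part (a) (a Python sketch, not part of the original solution), we can sum |x + y| over the stated support and solve for the normalizing constant:

# Cross-check of part (a): sum |x + y| over the support {-2, 0, 2} x {-1, 0, 1}
# and solve for the normalizing constant c.
from fractions import Fraction

support = [(x, y) for x in (-2, 0, 2) for y in (-1, 0, 1)]
total_weight = sum(abs(x + y) for x, y in support)  # 6 + 2 + 6 = 14
c = Fraction(1, total_weight)
print(c)  # 1/14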

(b)

P[Y < X] = PX,Y(0,−1) + PX,Y(2,−1) + PX,Y(2,0) + PX,Y(2,1) = c + c + 2c + 3c = 7c = 1/2

(c)

P[Y > X] = PX,Y(−2,−1) + PX,Y(−2,0) + PX,Y(−2,1) + PX,Y(0,1) = 3c + 2c + c + c = 7c = 1/2

(d) From the table of PX,Y(x,y) given above, P[X = Y] = 0, since the only point of the support with x = y is (0,0) and PX,Y(0,0) = 0.

(e)

P[X < 1] = PX,Y(−2,−1) + PX,Y(−2,0) + PX,Y(−2,1) + PX,Y(0,−1) + PX,Y(0,1) = 8c = 8/14
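The event probabilities in parts (b)-(e) can be verified the same way by enumerating the PMF directly; the sketch below uses exact fractions:

# Verify parts (b)-(e) by enumerating P_{X,Y}(x,y) = |x + y|/14 over the support.
from fractions import Fraction

c = Fraction(1, 14)
pmf = {(x, y): c * abs(x + y) for x in (-2, 0, 2) for y in (-1, 0, 1)}

print(sum(p for (x, y), p in pmf.items() if y < x))   # P[Y < X] = 1/2
print(sum(p for (x, y), p in pmf.items() if y > x))   # P[Y > X] = 1/2
print(sum(p for (x, y), p in pmf.items() if x == y))  # P[X = Y] = 0
print(sum(p for (x, y), p in pmf.items() if x < 1))   # P[X < 1] = 8/14 = 4/7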


Problem 3.4.2

In Problem 3.2.2, we found that the joint PMF of X and Y was

PX,Y(x,y)   x = −2   x = 0   x = 2
y = −1       3/14     1/14    1/14
y = 0        2/14      0      2/14
y = 1        1/14     1/14    3/14
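A short sketch tabulating this PMF and checking that it sums to one; the marginal sums are computed here only as an illustration and are not quoted in the text above:

# Tabulate P_{X,Y}(x,y) = |x + y|/14 from the table above and check normalization.
from fractions import Fraction

pmf = {(x, y): Fraction(abs(x + y), 14) for x in (-2, 0, 2) for y in (-1, 0, 1)}
print(sum(pmf.values()))  # 1
print({x: sum(pmf[(x, y)] for y in (-1, 0, 1)) for x in (-2, 0, 2)})  # marginal of X: 3/7, 1/7, 3/7
print({y: sum(pmf[(x, y)] for x in (-2, 0, 2)) for y in (-1, 0, 1)})  # marginal of Y: 5/14, 2/7, 5/14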

Problem 3.6.2

We can make a table of the possible outcomes and the corresponding values of W and Y:

outcome   P[·]        W    Y
hh        p^2         0    2
ht        p(1−p)      1    1
th        p(1−p)     −1    1
tt        (1−p)^2     0    0

In the following table, we write the joint PMF PW,Y(w,y) along with the marginal PMFs PY(y) and PW(w).

PW,Y(w,y)   w = −1    w = 0           w = 1     PY(y)
y = 0       0         (1−p)^2         0         (1−p)^2
y = 1       p(1−p)    0               p(1−p)    2p(1−p)
y = 2       0         p^2             0         p^2
PW(w)       p(1−p)    1 − 2p + 2p^2   p(1−p)

Using the definition PW|Y(w|y) = PW,Y(w,y)/PY(y), we can find the conditional PMFs of W given Y.

PW|Y(w|0) = 1 for w = 0, and 0 otherwise.

PW|Y(w|1) = 1/2 for w = −1, 1, and 0 otherwise.

PW|Y(w|2) = 1 for w = 0, and 0 otherwise.

Similarly, the conditional PMFs of Y given W are

PY|W(y|−1) = 1 for y = 1, and 0 otherwise.

PY|W(y|0) = (1−p)^2/(1 − 2p + 2p^2) for y = 0, p^2/(1 − 2p + 2p^2) for y = 2, and 0 otherwise.

PY|W(y|1) = 1 for y = 1, and 0 otherwise.
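The joint, marginal, and conditional PMFs above can be checked numerically for any particular p; the sketch below uses p = 0.4, an arbitrary illustrative value that is not part of the problem:

# Numerical check of Problem 3.6.2 for one assumed value p = 0.4 (illustrative only).
p = 0.4
joint = {                  # (w, y) -> P_{W,Y}(w, y), read off the outcome table
    (0, 2): p**2,          # hh
    (1, 1): p * (1 - p),   # ht
    (-1, 1): p * (1 - p),  # th
    (0, 0): (1 - p)**2,    # tt
}
P_W, P_Y = {}, {}
for (w, y), prob in joint.items():
    P_W[w] = P_W.get(w, 0) + prob
    P_Y[y] = P_Y.get(y, 0) + prob

cond_W_given_Y = {(w, y): prob / P_Y[y] for (w, y), prob in joint.items()}
print(cond_W_given_Y[(-1, 1)], cond_W_given_Y[(1, 1)])  # 0.5 and 0.5, matching P_{W|Y}(w|1) = 1/2
print(abs(P_W[0] - (1 - 2*p + 2*p**2)) < 1e-12)         # marginal matches 1 - 2p + 2p^2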

Problem 3.6.3

(a) First we observe that A takes on the values SA = {−1, 1} while B takes on values from SB = {0, 1}. To construct a table describing PA,B(a,b), we build a table for all possible values of the pairs (A, B). The general form of the entries is

PA,B(a,b)   b = 0               b = 1
a = −1      PB|A(0|−1)PA(−1)    PB|A(1|−1)PA(−1)
a = 1       PB|A(0|1)PA(1)      PB|A(1|1)PA(1)

Now we fill in the entries using the conditional PMFs PB|A(b|a) and the marginal PMF PA(a). This yields

PA,B(a,b)   b = 0         b = 1
a = −1      (1/3)(1/3)    (2/3)(1/3)
a = 1       (1/2)(2/3)    (1/2)(2/3)

which simplifies to

PA,B(a,b)   b = 0    b = 1
a = −1      1/9      2/9
a = 1       1/3      1/3
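As a check on part (a), the joint PMF can be rebuilt directly from the marginal PA and conditional PB|A values quoted in the tables above (a sketch, not from the text):

# Rebuild P_{A,B}(a,b) = P_{B|A}(b|a) * P_A(a) from the quoted marginal and conditional PMFs.
from fractions import Fraction as F

P_A = {-1: F(1, 3), 1: F(2, 3)}
P_B_given_A = {(-1, 0): F(1, 3), (-1, 1): F(2, 3),  # P_{B|A}(b | a = -1)
               (1, 0): F(1, 2), (1, 1): F(1, 2)}    # P_{B|A}(b | a = +1)

P_AB = {(a, b): P_B_given_A[(a, b)] * P_A[a] for a in (-1, 1) for b in (0, 1)}
print(P_AB)  # entries 1/9, 2/9, 1/3, 1/3, matching the simplified table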

(b) If A = 1, then the conditional expectation of B is

E[B | A = 1] = ∑_{b=0,1} b PB|A(b|1) = PB|A(1|1) = 1/2

(c) Before finding the conditional PMF PA|B(a|1), we first sum the columns of the joint PMF table to find

PB(b) = 4/9 for b = 0, and 5/9 for b = 1.

The conditional PMF of A given B = 1 is

PA|B(a|1) = PA,B(a,1)/PB(1) = 2/5 for a = −1, and 3/5 for a = 1.

(d) Now that we have the conditional PMF PA|B(a|1), calculating conditional expectations is easy.

E[A | B = 1] = ∑_{a=−1,1} a PA|B(a|1) = −1(2/5) + 1(3/5) = 1/5

E[A^2 | B = 1] = ∑_{a=−1,1} a^2 PA|B(a|1) = 2/5 + 3/5 = 1

The conditional variance is then

Var[A | B = 1] = E[A^2 | B = 1] − (E[A | B = 1])^2 = 1 − (1/5)^2 = 24/25
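Parts (b)-(d) can also be verified from the joint PMF table with a few lines of exact arithmetic (again a verification sketch, not part of the original solution):

# Check parts (b)-(d): E[B|A=1], P_{A|B}(a|1), E[A|B=1], and Var[A|B=1].
from fractions import Fraction as F

P_AB = {(-1, 0): F(1, 9), (-1, 1): F(2, 9), (1, 0): F(1, 3), (1, 1): F(1, 3)}
P_A = {a: P_AB[(a, 0)] + P_AB[(a, 1)] for a in (-1, 1)}
P_B = {b: P_AB[(-1, b)] + P_AB[(1, b)] for b in (0, 1)}

E_B_given_A1 = sum(b * P_AB[(1, b)] / P_A[1] for b in (0, 1))    # 1/2
P_A_given_B1 = {a: P_AB[(a, 1)] / P_B[1] for a in (-1, 1)}       # 2/5 and 3/5
E_A_given_B1 = sum(a * p for a, p in P_A_given_B1.items())       # 1/5
Var_A_given_B1 = sum(a**2 * p for a, p in P_A_given_B1.items()) - E_A_given_B1**2  # 24/25
print(E_B_given_A1, E_A_given_B1, Var_A_given_B1)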

(e) To calculate the covariance, we need

E[A] = ∑_{a=−1,1} a PA(a) = −1(1/3) + 1(2/3) = 1/3

E[B] = ∑_{b=0,1} b PB(b) = 0(4/9) + 1(5/9) = 5/9

E[AB] = ∑_{a=−1,1} ∑_{b=0,1} ab PA,B(a,b) = −1(0)(1/9) + (−1)(1)(2/9) + 1(0)(1/3) + 1(1)(1/3) = 1/9

The covariance is just

Cov[A, B] = E[AB] − E[A]E[B] = 1/9 − (1/3)(5/9) = −2/27
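Finally, the covariance computation in part (e) can be reproduced directly from the joint PMF table (a short verification sketch):

# Check part (e): E[A], E[B], E[AB], and Cov[A,B] from the joint PMF table.
from fractions import Fraction as F

P_AB = {(-1, 0): F(1, 9), (-1, 1): F(2, 9), (1, 0): F(1, 3), (1, 1): F(1, 3)}
E_A = sum(a * p for (a, b), p in P_AB.items())       # 1/3
E_B = sum(b * p for (a, b), p in P_AB.items())       # 5/9
E_AB = sum(a * b * p for (a, b), p in P_AB.items())  # 1/9
print(E_AB - E_A * E_B)                              # Cov[A,B] = -2/27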