Lecture 12: Expectation, and Linearity of Expectation
Anup Rao
April 26, 2019
We discuss the expectation of a real valued random variable.
Expectation
When a random variable takes a number as a value, it makes sense to talk about the average value that it takes. The expected value of a random variable X is defined as

E[X] = ∑_x p(X = x) · x.   (1)

For example, let X be the roll of a 6-sided die. The expectation is

E[X] = (1/6)·1 + (1/6)·2 + (1/6)·3 + (1/6)·4 + (1/6)·5 + (1/6)·6 = 21/6 = 3.5.
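As a sanity check, formula (1) can be evaluated mechanically. This is a minimal Python sketch (the `die` dictionary and `expectation` helper are our own illustration, not part of the notes):

```python
# Expected value per formula (1): E[X] = sum over x of p(X = x) * x.
from fractions import Fraction

# A fair 6-sided die: each face has probability 1/6.
die = {x: Fraction(1, 6) for x in range(1, 7)}

def expectation(dist):
    """E[X] for a distribution given as a {value: probability} mapping."""
    return sum(p * x for x, p in dist.items())

print(expectation(die))  # 7/2, i.e. 3.5
```

Using exact fractions avoids any floating-point rounding in the probabilities.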
If X is the number of heads when we toss a coin n times, the expected value of X is

E[X] = (n choose 0)/2^n · 0 + (n choose 1)/2^n · 1 + . . . + (n choose n)/2^n · n.   (2)

E[X] gives the center of mass of the distribution of X. However, there are many different distributions that can have the same expectation.
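Formula (2) can likewise be summed directly. The sketch below (our own check, not from the notes) evaluates it for a few values of n; the answer is always n/2:

```python
# E[X] for the number of heads in n fair coin tosses, per formula (2):
# E[X] = sum over k of (n choose k) / 2^n * k.
from math import comb
from fractions import Fraction

def expected_heads(n):
    return sum(Fraction(comb(n, k), 2**n) * k for k in range(n + 1))

for n in (1, 5, 10):
    print(n, expected_heads(n))  # each expectation equals n/2
```

Linearity of expectation, introduced below, explains the n/2 answer without evaluating this sum at all.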
For example, let X, Y, Z be random variables such that

X = 1000 with probability 1/2, and −1000 with probability 1/2;
Y = 1 with probability (n − 1)/n, and −(n − 1) with probability 1/n;
Z = 0 with probability 1.

Then note that E[X] = E[Y] = E[Z] = 0, even though these three variables have vastly different distributions. There are several misconceptions that people have about expectations.
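The three distributions above can be checked mechanically. A small sketch (the choice n = 100 is ours, for illustration; any n works):

```python
from fractions import Fraction

n = 100  # illustrative choice of n

X = {1000: Fraction(1, 2), -1000: Fraction(1, 2)}
Y = {1: Fraction(n - 1, n), -(n - 1): Fraction(1, n)}
Z = {0: Fraction(1)}

def expectation(dist):
    """E over a {value: probability} mapping."""
    return sum(p * x for x, p in dist.items())

# Each expectation equals 0, despite very different distributions.
print([expectation(d) for d in (X, Y, Z)])
```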


  • It is not necessarily true that a random variable will be close to its expectation with high probability. For example, X as defined above is never close to 0.
  • It is not necessarily true that a random variable will be above its expectation with probability about one half and below it with probability about one half. Consider Y above: it is above its expectation with probability (n − 1)/n.
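Both misconceptions show up numerically for Y. A short sketch (again with the illustrative choice n = 100):

```python
from fractions import Fraction

n = 100  # illustrative choice of n
Y = {1: Fraction(n - 1, n), -(n - 1): Fraction(1, n)}

mean = sum(p * y for y, p in Y.items())            # the expectation, 0
p_above = sum(p for y, p in Y.items() if y > mean)  # P(Y > E[Y])

print(mean)     # 0
print(p_above)  # 99/100: Y is above its expectation almost always
```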

Linearity of Expectation

The formula for expectation given in (1) is not always the easiest way to calculate an expectation. Here are some observations that can make the calculation much easier. The first extremely useful concept is the notion of linearity of expectation.

Fact 1. If X and Y are real valued random variables in the same probability space, then E[X + Y] = E[X] + E[Y].

Proof.

E[X + Y] = ∑_z p(X + Y = z) · z
         = ∑_{x,y} p(X = x, Y = y) · (x + y)
         = ∑_{x,y} p(X = x, Y = y) · x + ∑_{x,y} p(X = x, Y = y) · y.

Now we can express the first term as

∑_{x,y} p(X = x, Y = y) · x = ∑_{x,y} p(X = x) · x · p(Y = y | X = x)
                            = ∑_x p(X = x) · x · ∑_y p(Y = y | X = x)
                            = ∑_x p(X = x) · x
                            = E[X],

since ∑_y p(Y = y | X = x) = 1 for every x. Similarly, the second term is E[Y].
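Fact 1 can be verified on a concrete joint distribution; the particular joint distribution below is our own example, chosen so that X and Y are not independent:

```python
from fractions import Fraction

# A joint distribution p(X = x, Y = y), given as {(x, y): probability}.
joint = {
    (0, 0): Fraction(1, 4),
    (0, 1): Fraction(1, 4),
    (1, 1): Fraction(1, 2),  # X and Y are dependent here
}

E_X = sum(p * x for (x, y), p in joint.items())
E_Y = sum(p * y for (x, y), p in joint.items())
E_sum = sum(p * (x + y) for (x, y), p in joint.items())

# Linearity of expectation: E[X + Y] = E[X] + E[Y].
print(E_X, E_Y, E_sum)  # 1/2 3/4 5/4
```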

More generally, we have that for any real numbers α, β,

E[αX + βY] = α · E[X] + β · E[Y].

The amazing thing is that linearity of expectation even works when the random variables are dependent. This does not hold, for example, with multiplication: in general E[X · Y] ≠ E[X] · E[Y]. However, when X and Y are independent, we do have E[X · Y] = E[X] · E[Y].
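The contrast with multiplication can also be checked on small examples. The sketch below (both joint distributions are our own illustrations) compares a dependent pair, where Y = X, against an independent pair:

```python
from fractions import Fraction

def E(joint, f):
    """Expectation of f(x, y) under a joint distribution {(x, y): prob}."""
    return sum(p * f(x, y) for (x, y), p in joint.items())

# Dependent pair: Y = X, with X uniform on {0, 1}.
dep = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}
# Independent pair: each coordinate uniform on {0, 1}.
ind = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}

for joint in (dep, ind):
    lhs = E(joint, lambda x, y: x * y)                       # E[X * Y]
    rhs = E(joint, lambda x, y: x) * E(joint, lambda x, y: y)  # E[X] * E[Y]
    print(lhs, rhs)
# dependent:   E[X * Y] = 1/2  but  E[X] * E[Y] = 1/4  (not equal)
# independent: E[X * Y] = 1/4  =    E[X] * E[Y]        (equal)
```

Note that linearity needed no independence assumption at all, while the product rule holds only in the independent case.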