

These lecture notes cover the expectation of a real-valued random variable and its relation to the variable's distribution: the definition of expectation, the difference between the expectation and the values the variable actually takes, and common misconceptions about expectations. They also introduce linearity of expectation and its use in computing the expectation of a sum of random variables.
We discuss the expectation of a real-valued random variable.
When a random variable takes a number as a value, it makes sense to talk about the average value that it takes. The expected value of a random variable X is defined as

E[X] = \sum_x p(X = x) · x.    (1)
For example, let X be the roll of a 6-sided die. The expectation is

E[X] = 1 · (1/6) + 2 · (1/6) + ⋯ + 6 · (1/6) = 21/6 = 7/2.
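To connect the definition to something executable, here is a minimal Python sketch (my addition, not part of the original notes) that computes the expectation of a finite distribution directly from definition (1) and applies it to the die roll:

```python
from fractions import Fraction

def expectation(pmf):
    """Definition (1): E[X] = sum over x of p(X = x) * x, for a finite pmf
    given as a dict mapping each value x to p(X = x)."""
    return sum(p * x for x, p in pmf.items())

# X = roll of a fair 6-sided die.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expectation(die))  # 7/2
```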
If X is the number of heads when we toss a coin n times, the expected value of X is

E[X] = \binom{n}{0}/2^n · 0 + \binom{n}{1}/2^n · 1 + ⋯ + \binom{n}{n}/2^n · n.    (2)
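Evaluating (2) term by term is tedious; a quick Python check (again my addition) confirms that the sum works out to n/2, which is also what linearity of expectation, discussed below, gives immediately:

```python
from fractions import Fraction
from math import comb

def expected_heads(n):
    """Evaluate the sum in (2): sum over k of C(n, k) / 2^n * k."""
    return sum(Fraction(comb(n, k), 2 ** n) * k for k in range(n + 1))

for n in (1, 2, 5, 10):
    print(n, expected_heads(n))  # prints n/2 each time: 1/2, 1, 5/2, 5
```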
E[X] gives the center of mass of the distribution of X. However, there are many different distributions that can have the same expectation. For example, let X, Y, Z be random variables such that

X = 1000 with probability 1/2, and X = −1000 with probability 1/2;
Y = 1 with probability (n − 1)/n, and Y = −(n − 1) with probability 1/n;
Z = 0 with probability 1.

Then note that E[X] = E[Y] = E[Z] = 0, even though these three variables have vastly different distributions. There are several misconceptions that people have about expectations.
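As a quick check on that claim, the same definition-(1) computation applied to all three distributions (with, say, n = 10, a hypothetical choice for illustration) gives expectation 0 in each case:

```python
from fractions import Fraction

def expectation(pmf):
    """E[X] for a finite pmf given as a dict mapping value -> probability."""
    return sum(p * x for x, p in pmf.items())

n = 10  # any n >= 2 behaves the same way
X = {1000: Fraction(1, 2), -1000: Fraction(1, 2)}
Y = {1: Fraction(n - 1, n), -(n - 1): Fraction(1, n)}
Z = {0: Fraction(1)}

print(expectation(X), expectation(Y), expectation(Z))  # 0 0 0
```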
The formula for expectation given in (1) is not always the easiest way to calculate the expectation. Here are some observations that can make the calculation much easier. The first extremely useful concept is the notion of linearity of expectation.
Fact 1. If X and Y are real-valued random variables in the same probability space, then E[X + Y] = E[X] + E[Y].
Proof.

E[X + Y] = \sum_z p(X + Y = z) · z
         = \sum_{x,y} p(X = x, Y = y) · (x + y)
         = \sum_{x,y} p(X = x, Y = y) · x + \sum_{x,y} p(X = x, Y = y) · y.

Now we can express the first term as

\sum_{x,y} p(X = x, Y = y) · x = \sum_{x,y} p(X = x) · x · p(Y = y | X = x)
                               = \sum_x p(X = x) · x · \sum_y p(Y = y | X = x)
                               = \sum_x p(X = x) · x = E[X],

since for each fixed x the conditional probabilities p(Y = y | X = x) sum to 1 over y. Similarly, the second term is E[Y].
More generally, we have that for any real numbers α, β,

E[αX + βY] = α · E[X] + β · E[Y].
The amazing thing is that linearity of expectation works even when the random variables are dependent. This does not hold, for example, for multiplication: in general, E[X · Y] ≠ E[X] · E[Y]. However, when X and Y are independent, we do have E[X · Y] = E[X] · E[Y].
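To see the dependence point concretely, here is one more small Python sketch (my own illustration, not from the notes): let X be a fair die roll and Y = X, which are as dependent as possible. Linearity still gives E[X + Y] = E[X] + E[Y], but E[X · Y] differs from E[X] · E[Y]:

```python
from fractions import Fraction

# Joint distribution of (X, Y) where X is a fair die roll and Y = X (fully dependent).
joint = {(x, x): Fraction(1, 6) for x in range(1, 7)}

E_X    = sum(p * x       for (x, y), p in joint.items())
E_Y    = sum(p * y       for (x, y), p in joint.items())
E_sum  = sum(p * (x + y) for (x, y), p in joint.items())
E_prod = sum(p * (x * y) for (x, y), p in joint.items())

print(E_sum, E_X + E_Y)   # 7 7       -> linearity holds despite the dependence
print(E_prod, E_X * E_Y)  # 91/6 49/4 -> the product rule fails for dependent X, Y
```

Replacing Y = X with an independent second die roll, the same computation gives E[X · Y] = E[X] · E[Y] = 49/4, as the independent case promises.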