

Random Variables and Expectation

Chris Piech

CS109

Handout #10
April 6th, 2016

Random Variable

A Random Variable (RV) is a variable that probabilistically takes on different values. You can think of an RV as being like a variable in a programming language: it takes on values, has a type, and has a domain over which it is applicable. We can define events that occur if the random variable takes on values that satisfy a numerical test (e.g., does the variable equal 5? is the variable less than 8?). We often think of the probabilities of such events.

As an example, let’s say we flip three fair coins. We can define a random variable Y to be the total number of “heads” on the three coins. We can ask about the probability of Y taking on different values using the following notation:

  • P(Y = 0) = 1/8 (T, T, T)
  • P(Y = 1) = 3/8 (H, T, T), (T, H, T), (T, T, H)
  • P(Y = 2) = 3/8 (H, H, T), (H, T, H), (T, H, H)
  • P(Y = 3) = 1/8 (H, H, H)
  • P(Y ≥ 4) = 0
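
To make this concrete, here is a minimal Python sketch (not part of the original handout) that recovers the PMF of Y by enumerating all 2^3 equally likely outcomes:

    from itertools import product
    from collections import Counter

    # Enumerate all 2^3 = 8 equally likely outcomes of three fair coin flips.
    outcomes = list(product("HT", repeat=3))

    # Y = total number of heads in an outcome.
    counts = Counter(flips.count("H") for flips in outcomes)

    for y in range(4):
        print(f"P(Y = {y}) = {counts[y]}/{len(outcomes)}")
    # P(Y = 0) = 1/8, P(Y = 1) = 3/8, P(Y = 2) = 3/8, P(Y = 3) = 1/8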

Using random variables is a convenient notational technique that assists in decomposing problems. There are many different types of random variables (indicator, binary, choice, Bernoulli, etc.). The two main families of random variable types are discrete and continuous.

Probability Mass Function

A probability mass function (PMF) is a function that maps each possible outcome of a random variable to its corresponding probability. We can plot PMF graphs:

[Figure: On the left, the PMF of a single 6-sided die roll, with P(X = x) = 1/6 for x = 1, ..., 6. On the right, the PMF of the sum of two dice rolls, with probabilities rising from 1/36 at x = 2 to 6/36 at x = 7 and falling back to 1/36 at x = 12.]
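
The right-hand plot can be reproduced by counting which of the 36 equally likely ordered pairs of rolls produce each sum; a short sketch (mine, not the handout's):

    from itertools import product
    from collections import Counter

    # All 36 equally likely ordered pairs of two fair 6-sided die rolls.
    pmf = Counter(a + b for a, b in product(range(1, 7), repeat=2))

    for x in range(2, 13):
        print(f"P(X = {x:2d}) = {pmf[x]}/36")
    # Rises from 1/36 at x = 2 to 6/36 at x = 7, then falls back to 1/36 at x = 12.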

Cumulative Distribution Function

For a random variable X, the Cumulative Distribution Function (CDF) is defined as:

F(a) = P(X ≤ a), where −∞ < a < ∞
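
In the discrete case, F(a) is just the PMF summed over all values at most a. A minimal sketch, assuming the fair 6-sided die from the figure above:

    from fractions import Fraction

    # PMF of a fair 6-sided die: P(X = x) = 1/6 for x in 1..6.
    pmf = {x: Fraction(1, 6) for x in range(1, 7)}

    def cdf(a):
        """F(a) = P(X <= a): total probability mass at or below a."""
        return sum(p for x, p in pmf.items() if x <= a)

    print(cdf(0))    # 0   (no mass below 1)
    print(cdf(3))    # 1/2
    print(cdf(6.5))  # 1   (all mass is at or below 6)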

Expected Value

The Expected Value for a discrete random variable X is defined as:

E[X] = ∑_{x : P(x) > 0} x · P(x)

It goes by many other names: Mean, Expectation, Weighted Average, Center of Mass, 1st Moment.

Example 1

Let's say you roll a 6-sided die and that a random variable X represents the outcome of the roll. What is E[X]? This is the same as asking for the average value of a roll.

E[X] = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 21/6 = 7/2
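
The same sum, written in code with exact fractions (a sketch, not from the handout):

    from fractions import Fraction

    # E[X] = sum of x * P(X = x) over the six equally likely faces.
    expectation = sum(x * Fraction(1, 6) for x in range(1, 7))
    print(expectation)  # 7/2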

Example 2

Let's say a school has 3 classes with 5, 10, and 150 students. If we randomly choose a class with equal probability and let X = the size of the chosen class:

E[X] = 5(1/3) + 10(1/3) + 150(1/3) = 165/3 = 55

If instead we randomly choose a student with equal probability and let Y = the size of the class that student is in:

E[Y] = 5(5/165) + 10(10/165) + 150(150/165) = 22625/165 ≈ 137
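
Both expectations use the same definition, just with different PMFs; larger classes are over-represented when sampling students. A sketch (variable names are mine):

    from fractions import Fraction

    sizes = [5, 10, 150]
    total = sum(sizes)  # 165 students in all

    # X: choose a class uniformly, so each size has probability 1/3.
    E_X = sum(s * Fraction(1, 3) for s in sizes)

    # Y: choose a student uniformly, so a class of size s is seen
    # with probability s/165.
    E_Y = sum(s * Fraction(s, total) for s in sizes)

    print(E_X)         # 55
    print(float(E_Y))  # 137.12...: student sampling favors the big class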

Example 3

Consider a game played with a fair coin which comes up heads with p = 0.5. Let n = the number of coin flips before the first "tails". In this game you win $2^n. How many dollars do you expect to win? Let X be a random variable representing your winnings.

E[X] = ∑_{i=0}^{∞} (1/2)^{i+1} · 2^i = ∑_{i=0}^{∞} 1/2 = ∞

An outcome with exactly i heads followed by the first tail has probability (1/2)^{i+1} and pays 2^i dollars, so every term of the sum is 1/2 and the expected winnings are infinite.
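
Since every term of the sum is 1/2, the partial sums grow without bound. A short sketch making the divergence visible (the truncation points N are my choice):

    # Partial sums of E[X] = sum over i of (1/2)**(i + 1) * 2**i.
    # Each term equals 1/2, so the partial sum of the first N terms is N/2.
    for N in (10, 100, 1000):
        partial = sum((0.5 ** (i + 1)) * (2 ** i) for i in range(N))
        print(N, partial)  # 10 -> 5.0, 100 -> 50.0, 1000 -> 500.0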

Linearity

Expectations preserve linearity, which means that E[aX + b] = aE[X] + b for any constants a and b.
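
For example, if X is the fair die roll from Example 1, then E[2X + 1] = 2 · (7/2) + 1 = 8. A quick numerical check (a sketch, not from the handout):

    from fractions import Fraction

    pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair 6-sided die

    # Left side: E[2X + 1] computed directly from the PMF.
    lhs = sum((2 * x + 1) * p for x, p in pmf.items())

    # Right side: 2 * E[X] + 1 using linearity.
    rhs = 2 * sum(x * p for x, p in pmf.items()) + 1

    print(lhs, rhs)  # 8 8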

Indicator Random Variable

A variable I is called an indicator variable for an event A if I = 1 when A occurs and I = 0 when A does not occur. Note that P(I = 1) = P(A), and so E[I] = P(A). Indicator variables are useful for cleaning up notation.
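
A small simulation illustrating E[I] = P(A); the event A = "a fair die roll is even" is my choice of example:

    import random

    # I = 1 when A (an even roll) occurs, I = 0 otherwise.
    trials = 100_000
    mean_I = sum(random.randint(1, 6) % 2 == 0 for _ in range(trials)) / trials

    print(mean_I)  # close to P(A) = 1/2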

Disclaimer: This handout was made fresh just for you. Notice any mistakes? Let Chris know.