
PwA Cheatsheet
Common Distributions
Discrete

Name | pmf | cdf | mean | variance
Binomial(n,p) | C(n,k) p^k (1−p)^(n−k) | F(k;n,p) = P(X≤k) = ∑_{i=0}^{⌊k⌋} C(n,i) p^i (1−p)^(n−i) | np | np(1−p)
Neg. Binomial(r,p) | C(i−1,r−1) p^r (1−p)^(i−r) | — | r/p | r(1−p)/p^2
Bernoulli(p) | q = 1−p for k=0; p for k=1 | 0 for k<0; 1−p for 0≤k<1; 1 for k≥1 | p | p(1−p)
Uniform(a,b) | 1/n, n = b−a+1 | (⌊k⌋−a+1)/n | (a+b)/2 | ((b−a+1)^2 − 1)/12
Geometric(p) | p(1−p)^(i−1) | 1−(1−p)^i | 1/p | (1−p)/p^2
Hypergeometric(N,K,n) (k successes in n draws from N items of which K are successes) | C(K,k) C(N−K,n−k) / C(N,n) | — | nK/N | (nK/N)·((N−K)/N)·((N−n)/(N−1))
Poisson(λ) | λ^k e^(−λ)/k! | e^(−λ) ∑_{i=0}^{⌊k⌋} λ^i/i! | λ | λ
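A quick numeric check of the Binomial row above, a sketch with illustrative parameters n=10, p=0.3 (not values from the sheet):

```python
# Verify mean = np and variance = np(1-p) for Binomial(n, p).
from math import comb

n, p = 10, 0.3  # illustrative values
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum(k**2 * pk for k, pk in enumerate(pmf)) - mean**2

assert abs(sum(pmf) - 1) < 1e-12           # pmf sums to 1
assert abs(mean - n * p) < 1e-12           # mean = np
assert abs(var - n * p * (1 - p)) < 1e-12  # variance = np(1-p)
```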
Continuous

Name | pdf | cdf | mean | variance
Uniform(a,b) | 1/(b−a) for x∈[a,b]; 0 otherwise | 0 for x<a; (x−a)/(b−a) for x∈[a,b); 1 for x≥b | (a+b)/2 | (b−a)^2/12
Normal(µ,σ^2) | (1/√(2σ^2π)) e^(−(x−µ)^2/(2σ^2)) | (1/2)[1 + erf((x−µ)/(σ√2))] | µ | σ^2
Exponential(λ) | λe^(−λx) | 1−e^(−λx) | 1/λ | 1/λ^2
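The Normal cdf-via-erf formula above can be sanity-checked by integrating the pdf numerically; a sketch with illustrative µ=1, σ=2:

```python
# Check cdf(x) = (1/2)[1 + erf((x-µ)/(σ√2))] against a trapezoid-rule
# integral of the Normal pdf.
from math import erf, exp, pi, sqrt

mu, sigma = 1.0, 2.0  # illustrative parameters

def pdf(x):
    return 1 / sqrt(2 * sigma**2 * pi) * exp(-(x - mu)**2 / (2 * sigma**2))

def cdf(x):
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

x, lo, steps = 2.5, mu - 10 * sigma, 20000
h = (x - lo) / steps
area = sum(0.5 * (pdf(lo + i * h) + pdf(lo + (i + 1) * h)) * h
           for i in range(steps))

assert abs(area - cdf(x)) < 1e-6  # closed form matches numeric integral
```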
Hazard/Failure Rate Functions

Survival: F̄(t) = 1 − F(t)
Hazard Rate: λ(t) = f(t)/F̄(t)
Distribution: F(t) = 1 − exp{−∫_0^t λ(s) ds}
Book: p. 217
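For the Exponential(λ) row this machinery collapses: the hazard is the constant λ, and rebuilding F from it recovers 1 − e^(−λt). A sketch with illustrative λ=0.5, t=3:

```python
# Exponential(λ): hazard λ(t) = f(t)/F̄(t) is constant, and
# F(t) = 1 − exp{−∫₀ᵗ λ(s) ds} reproduces the cdf.
from math import exp

lam, t = 0.5, 3.0            # illustrative rate and time

f = lam * exp(-lam * t)      # pdf
F = 1 - exp(-lam * t)        # cdf
survival = 1 - F             # F̄(t)
hazard = f / survival        # λ(t) = f(t)/F̄(t)

assert abs(hazard - lam) < 1e-12              # constant hazard = λ
# ∫₀ᵗ λ ds = λt, so F(t) = 1 − e^{−λt}:
assert abs((1 - exp(-hazard * t)) - F) < 1e-12
```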
Events

Sample Space: S = {all possible outcomes}
Event: E ⊂ S
Union (either or both): E ∪ F
Intersection (both): E ∩ F, also written EF
Complement: E^C = S\E ⇒ P(E^C) = 1 − P(E)
Inclusion-Exclusion: P(A∪B) = P(A) + P(B) − P(A∩B)
DeMorgan's Laws:
1. (E1 ∪ ... ∪ En)^C = E1^C ∩ ... ∩ En^C
2. (E1 ∩ ... ∩ En)^C = E1^C ∪ ... ∪ En^C
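DeMorgan's laws are easy to confirm mechanically with finite sets; a sketch on illustrative sets:

```python
# Check De Morgan's laws on small finite sets.
S = set(range(10))                      # sample space
E1, E2, E3 = {1, 2, 3}, {2, 4, 6}, {3, 6, 9}

def comp(E):                            # complement within S
    return S - E

assert comp(E1 | E2 | E3) == comp(E1) & comp(E2) & comp(E3)
assert comp(E1 & E2 & E3) == comp(E1) | comp(E2) | comp(E3)
```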
Axioms
1. 0 ≤ P(E) ≤ 1
2. P(S) = 1
3. For mutually exclusive events Ai, i ≥ 1:
   P(∪_{i=1}^∞ Ai) = ∑_{i=1}^∞ P(Ai)

Finite S, equal probability for all sample points: P(A) = |A| / |S|

Odds of Event: α = P(A)/P(A^C) = P(A)/(1−P(A))
Conditional Probability and Independence I

Conditional Probability: P(F|E) = P(F∩E)/P(E)
Independence: if P(F∩E) = P(F)P(E)
Multiplication Rule: P(E1E2···En) = P(E1) P(E2|E1) ··· P(En|E1···En−1)
Bayes Formula (simple): P(A|B) = P(B|A)P(A) / P(B)
Bayes Formula (full): P(Ai|B) = P(B|Ai)P(Ai) / ∑_j P(B|Aj)P(Aj)
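The full Bayes formula can be worked exactly with rational arithmetic; a sketch with an illustrative three-event partition (priors and likelihoods invented for the example):

```python
# Bayes (full): P(Ai|B) = P(B|Ai)P(Ai) / Σ_j P(B|Aj)P(Aj)
from fractions import Fraction as F

prior = [F(1, 2), F(1, 3), F(1, 6)]    # P(Ai), a partition
like  = [F(1, 10), F(1, 5), F(3, 10)]  # P(B|Ai)

evidence = sum(l * p for l, p in zip(like, prior))           # P(B)
posterior = [l * p / evidence for l, p in zip(like, prior)]  # P(Ai|B)

assert sum(posterior) == 1             # posteriors form a distribution
assert posterior[0] == like[0] * prior[0] / evidence
```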
Conditional pmf (discrete): p_{X|Y}(x|y) = p(x,y)/p_Y(y)
Conditional cdf (discrete): F_{X|Y}(x|y) = ∑_{a≤x} p_{X|Y}(a|y)
Conditional Density (continuous): f_{X|Y}(x|y) = f(x,y)/f_Y(y)
Conditional Probabilities (continuous): P{X∈A | Y=y} = ∫_A f_{X|Y}(x|y) dx
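In the discrete case, conditioning is a renormalization of one slice of the joint pmf; a sketch on an illustrative joint pmf:

```python
# p_{X|Y}(x|y) = p(x,y)/p_Y(y): each conditional pmf sums to 1 over x.
joint = {(0, 0): 0.15, (0, 1): 0.25, (1, 0): 0.35, (1, 1): 0.25}

def pY(y):                             # marginal of Y
    return sum(p for (x, yy), p in joint.items() if yy == y)

def p_cond(x, y):                      # conditional pmf of X given Y=y
    return joint[(x, y)] / pY(y)

for y in (0, 1):
    assert abs(sum(p_cond(x, y) for x in (0, 1)) - 1) < 1e-12
```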
Random Variables (Discrete)

Distribution Function: F(x) = P{X ≤ x}
Probability Mass Function: p(x) = P{X = x}
Joint Probability Mass Function:
  P(X=x and Y=y)
  = P(Y=y | X=x) · P(X=x)
  = P(X=x | Y=y) · P(Y=y)
Expectation: E[X] = ∑_{x: p(x)>0} x p(x)
  Note: E[g(X)] = ∑_{x: p(x)>0} g(x) p(x)
Variance: Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2
Standard Deviation: σ = √Var(X)
Covariance: Cov(X,Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]
Moment Generating Function: M(t) = E[e^{tX}] (same for continuous RVs)
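The definitions above tie together numerically: a sketch computing E, Var, and the MGF for an illustrative pmf, checking that M'(0) = E[X]:

```python
# E[X], Var(X), and M(t) = E[e^{tX}] for a small discrete pmf.
from math import exp

pmf = {0: 0.2, 1: 0.5, 2: 0.3}         # illustrative pmf

E   = sum(x * p for x, p in pmf.items())
var = sum((x - E)**2 * p for x, p in pmf.items())
E2  = sum(x**2 * p for x, p in pmf.items())

assert abs(var - (E2 - E**2)) < 1e-12  # Var = E[X²] − (E[X])²

def M(t):                              # moment generating function
    return sum(exp(t * x) * p for x, p in pmf.items())

h = 1e-6                               # M'(0) = E[X], via central difference
assert abs((M(h) - M(-h)) / (2 * h) - E) < 1e-6
```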
Random Variables (Continuous) I

Probability Density Function: f such that P{X∈B} = ∫_B f(x) dx
Distribution Function: F such that d/dx F(x) = f(x)
Expectation: E[X] = ∫_{−∞}^{∞} x f(x) dx
  Note: E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx
Variance: Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2
Standard Deviation: σ = √Var(X)
Covariance: Cov(X,Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]
Joint Probability Density Function:
  P{(X,Y)∈C} = ∫∫_{(x,y)∈C} f(x,y) dx dy
  P{X∈A, Y∈B} = ∫_B ∫_A f(x,y) dx dy

Random Variables (Continuous) II

Marginal pdfs:
  f_X(x) = ∫_{−∞}^{∞} f(x,y) dy
  f_Y(y) = ∫_{−∞}^{∞} f(x,y) dx
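Marginalizing integrates the other variable out; a numeric sketch with the illustrative joint density f(x,y) = 4xy on the unit square, whose marginals are 2x and 2y:

```python
# f_X(x) = ∫ f(x,y) dy, approximated with the trapezoid rule.
def f(x, y):                            # illustrative joint density
    return 4 * x * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def fX(x, steps=10000):                 # marginal of X over y ∈ [0,1]
    h = 1.0 / steps
    return sum(0.5 * (f(x, i * h) + f(x, (i + 1) * h)) * h
               for i in range(steps))

assert abs(fX(0.3) - 2 * 0.3) < 1e-6    # marginal of 4xy is 2x
```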
More on Expectation, Variance, ...

E[X+Y] = E[X] + E[Y]
E[αX] = αE[X]
Var(X+a) = Var(X)
Var(aX+b) = a^2 Var(X)
Var(X+Y) = E[(X+Y)^2] − (E[X+Y])^2
         = E[X^2 + 2XY + Y^2] − (E[X] + E[Y])^2
         = E[X^2] + 2E[XY] + E[Y^2] − (E[X])^2 − 2E[X]E[Y] − (E[Y])^2
         = Var(X) + Var(Y) + 2(E[XY] − E[X]E[Y])
         = Var(X) + Var(Y) + 2 Cov(X,Y)

Independence
⇒ E[f(X)g(Y)] = E[f(X)] E[g(Y)]
⇒ E[XY] = E[X]E[Y]
⇒ Cov(X,Y) = 0
⇒ Var(X+Y) = Var(X) + Var(Y)
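The independence implications can be verified exactly on a product joint pmf; a sketch with illustrative marginals:

```python
# Independent X, Y: Cov(X,Y) = 0 and Var(X+Y) = Var(X) + Var(Y).
from itertools import product

pX = {0: 0.4, 1: 0.6}                   # illustrative marginal pmfs
pY = {1: 0.5, 2: 0.5}

# Independence: joint pmf is the product of the marginals
joint = {(x, y): px * py
         for (x, px), (y, py) in product(pX.items(), pY.items())}

def E(g):                               # E[g(X,Y)] under the joint pmf
    return sum(g(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov  = E(lambda x, y: x * y) - EX * EY
varX = E(lambda x, y: x**2) - EX**2
varY = E(lambda x, y: y**2) - EY**2
varS = E(lambda x, y: (x + y)**2) - E(lambda x, y: x + y)**2

assert abs(cov) < 1e-12                 # Cov(X,Y) = 0
assert abs(varS - (varX + varY)) < 1e-12
```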
Correlation: corr(X,Y) = ρ(X,Y) = Cov(X,Y) / √(Var(X) Var(Y))
1. −1 ≤ ρ(X,Y) ≤ 1
2. Independence ⇒ ρ(X,Y) = 0
3. For Y = mX + c with m ≠ 0:
   m > 0 ⇒ ρ(X,Y) = 1
   m < 0 ⇒ ρ(X,Y) = −1

Law of Total Expectation: E[X] = E[E[X|Y]]
Discrete:   E[X] = ∑_y E[X|Y=y] P{Y=y}
Continuous: E[X] = ∫_{−∞}^{∞} E[X|Y=y] f_Y(y) dy
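The discrete form of the tower rule checks out directly on a joint pmf; a sketch with illustrative values:

```python
# E[X] = Σ_y E[X|Y=y] P{Y=y}  (discrete law of total expectation)
joint = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.4}

EX = sum(x * p for (x, y), p in joint.items())   # direct E[X]

pY = {}                                          # marginal of Y
for (x, y), p in joint.items():
    pY[y] = pY.get(y, 0) + p

def E_X_given(y):                                # E[X | Y=y]
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / pY[y]

tower = sum(E_X_given(y) * py for y, py in pY.items())
assert abs(tower - EX) < 1e-12                   # tower rule holds
```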