The flashcards below were created by user mattstam on FreezingBlue Flashcards.

PDF Facts
f(x) = P[X = x] >= 0 for all x
sum_{x}[f(x)] = 1

Random Variable
A function that assigns a unique numerical value to each sample space outcome.

CDF
F(x) = P[X <= x], the probability that X takes a value less than or equal to x.
Given this discrete PDF f(x), what is the CDF value P[X <= 1.5]?
 x   f(x)
 1   0.1
 2   0.2
 3   0.3
 4   0.4
P[X <= 1.5] = P[X = 1] = f(1) = 0.1
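
A quick numeric check (a hypothetical Python sketch; the table's f(x) is treated as the PMF, so P[X <= 1.5] sums the mass at every support point at or below 1.5):

```python
# PMF from the card above
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

def cdf(pmf, t):
    """CDF: P[X <= t] = sum of f(x) over all support points x <= t."""
    return sum(p for x, p in pmf.items() if x <= t)

print(cdf(pmf, 1.5))  # -> 0.1 (only x = 1 qualifies)
```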

Joint PDF: f(x, y) for discrete r.v.
P[X = x, Y = y]

Joint PDF: f(x, y) for continuous r.v. over a region A
P[(X, Y) in A] = double integral over A of f(x, y) dx dy

Marginal PDF: f(x) for discrete r.v.
f(x) = sum_{y}[f(x,y)]
f(y) = sum_{x}[f(x,y)]
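
A small sketch (hypothetical joint PMF) showing how summing the joint PDF over one variable yields the marginals:

```python
# Hypothetical joint PMF, keyed by (x, y)
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# f(x) = sum_{y}[f(x, y)];  f(y) = sum_{x}[f(x, y)]
fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0.0) + p
    fy[y] = fy.get(y, 0.0) + p

print({x: round(p, 3) for x, p in fx.items()})  # -> {0: 0.3, 1: 0.7}
print({y: round(p, 3) for y, p in fy.items()})  # -> {0: 0.4, 1: 0.6}
```

Each marginal still sums to 1, as any PDF must.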

Marginal PDF: f(x) for continuous r.v.
f(x) = integral[f(x, y)dy]
f(y) = integral[f(x, y)dx]

Conditional Probability: P[X = x | Y = y] (i.e. f(x|y))
= P[X = x, Y = y]/P[Y = y] = f(x, y)/f(y)
The conditional probability that X equals x given that Y equals y is equal to the joint PDF of X and Y divided by the marginal PDF of y.

X, Y are independent IFF
(1) f(x, y) = f(x)f(y) for all x, y.
 Or (derived from (1))
 (2) f(x|y) = f(x),
 (3) f(y|x) = f(y) for all x, y.

Given that X, Y are independent, show that f(x|y) = f(x) for all x, y.
 We know that
 (i) X, Y are independent if f(x, y) = f(x)f(y) for all x, y.
 (ii) f(x|y) = f(x, y)/f(y)
So by (i), f(x|y) = f(x)f(y)/f(y) => f(x|y) = f(x).

Expected Value: E[X] for
(1) Discrete: E[X] = sum_{x}[x*f(x)]
(2) Continuous: E[X] = integral[x*f(x)dx]

What does Expected Value mean for a discrete random variable?
The average value of X on an infinite number of experimental trials.
For example, E[X] for a six-sided fair die is 3.5. Clearly, saying something like "the likely value of a die roll is 3.5" does not make sense; "average" does.
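
The die example follows directly from the definition E[X] = sum_{x}[x*f(x)] (a sketch using exact fractions):

```python
from fractions import Fraction

# Fair six-sided die: f(x) = 1/6 for x = 1..6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum_{x} x * f(x)
expected = sum(x * p for x, p in pmf.items())
print(expected)  # -> 7/2, i.e. 3.5
```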

Let g(X) be a function of X. What is E[g(X)]?
E[g(X)] = sum_{x}[g(x)f(x)] (discrete) or integral[g(x)f(x)dx] (continuous)
 Remember: even though X is being transformed by g(X), you still use the original PDF f(x).

If g(X) equals c, a constant, then E[g(X)] =
E[c] = c
Remember: The Expected Value of a constant is that constant. This allows you to pull constants out of the E[ ] operator.

Show that E[c] = c, where c is a constant.
Let g(X) = c => E[g(X)] = integral[g(x)f(x)dx] = integral[cf(x)dx] = c*integral[f(x)dx] = c(1) = c

If g(X) = aX + b, then E[g(X)] =
E[aX + b] = aE[X] + b

Show that E[aX + b] = aE[X] + b
Let g(X) = aX + b => E[g(X)] = E[aX + b] = E[aX] + E[b] = aE[X] + b

Let g(X) = g_{1}(X) + g_{2}(X) + . . . + g_{n}(X),
what is E[g(X)]?
E[g(X)] = E[g_{1}(X)] + E[g_{2}(X)] + . . . + E[g_{n}(X)]
 Remember: the expected value of the sum is the sum of the expected values.

Var[X]
= E[(X - mu)^{2}] = E[X^{2}] - mu^{2}, where mu = E[X].
 Proof:
 E[(X - mu)^{2}] = E[(X - mu)(X - mu)] = E[X^{2} - 2muX + mu^{2}] = E[X^{2}] - 2muE[X] + mu^{2} = E[X^{2}] - 2mu*mu + mu^{2} = E[X^{2}] - 2mu^{2} + mu^{2} = E[X^{2}] - mu^{2}
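
The identity can be sanity-checked numerically on a small hypothetical PMF; both forms of Var[X] agree:

```python
pmf = {1: 0.2, 2: 0.5, 3: 0.3}  # hypothetical PMF

mu = sum(x * p for x, p in pmf.items())                       # E[X]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())      # E[(X - mu)^2]
var_short = sum(x * x * p for x, p in pmf.items()) - mu ** 2  # E[X^2] - mu^2

print(abs(var_def - var_short) < 1e-12)  # -> True
```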

Let Y = a + bX, what is Var[Y]?
= b^{2}Var[X]
 Proof:
 Let Y = a + bX => E[Y] = a + bE[X]. Then Var(Y) = E[(Y - E[Y])^{2}] = E[(a + bX - a - bE[X])^{2}] = E[b^{2}(X - E[X])^{2}] = b^{2}Var(X)

Standardized Variable:
z =
 (x - mu)/sig
 where sig = sqrt[Var(X)] (i.e. the standard deviation)

Show that E[z] = 0
 E[z] = E[(1/sig)X - mu/sig]
 = (1/sig)E[X] - E[mu/sig]
 = mu/sig - mu/sig = 0

Show that Var(z) = 1
 Var(z) = Var[(1/sig)X - mu/sig]
 = (1/sig)^{2}Var(X) = sig^{2}/sig^{2} = 1
Remember: to get to line two, Var(aX + b) = a^{2}Var(X).
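
Both standardization facts (E[z] = 0 and Var(z) = 1) can be verified on a small hypothetical PMF:

```python
import math

pmf = {1: 0.2, 2: 0.5, 3: 0.3}  # hypothetical PMF
mu = sum(x * p for x, p in pmf.items())
sig = math.sqrt(sum((x - mu) ** 2 * p for x, p in pmf.items()))

# z = (x - mu)/sig applied to each support point
Ez = sum((x - mu) / sig * p for x, p in pmf.items())
Varz = sum(((x - mu) / sig - Ez) ** 2 * p for x, p in pmf.items())

print(abs(Ez) < 1e-9, abs(Varz - 1) < 1e-9)  # -> True True
```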

Let g(X_{1}, X_{2}) have joint PDF f(X_{1}, X_{2}).
E[g(X_{1}, X_{2})] =
 sum_{x1}sum_{x2}[g(x_{1}, x_{2})f(x_{1}, x_{2})] (discrete) or double integral[g(x_{1}, x_{2})f(x_{1}, x_{2})dx_{1}dx_{2}] (continuous)

cov(X_{1}, X_{2})
 = E[X_{1}X_{2}] - mu_{1}mu_{2}
 Derivation:
 cov(X_{1}, X_{2}) = E[(X_{1} - mu_{1})(X_{2} - mu_{2})] = E[X_{1}X_{2} - mu_{1}X_{2} - mu_{2}X_{1} + mu_{1}mu_{2}] = E[X_{1}X_{2}] - mu_{1}mu_{2} - mu_{1}mu_{2} + mu_{1}mu_{2} = E[X_{1}X_{2}] - mu_{1}mu_{2}
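
The shortcut formula in code, on a hypothetical joint PMF:

```python
# Hypothetical joint PMF for (X1, X2)
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

mu1 = sum(x * p for (x, y), p in joint.items())      # E[X1]
mu2 = sum(y * p for (x, y), p in joint.items())      # E[X2]
Exy = sum(x * y * p for (x, y), p in joint.items())  # E[X1*X2]

cov = Exy - mu1 * mu2
print(round(cov, 6))  # -> 0.1 (positive: the pair tends to move together)
```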


If cov(X,Y)
1. > 0
2. < 0
3. = 0
 1. (X, Y) pairs tend to be both greater than their means or both less than their means.
 2. (X, Y) pairs tend to be mixed about their means (one greater and one less)
 3. (X, Y) pairs "evenly" spread about their means.

Correlation
p = cov(X, Y)/(sig_{X}sig_{Y})
If p (for correlation)
1. = 1
2. = -1
3. = 0
 1. perfect positive linear relation
 2. perfect negative linear relation
 3. no linear relationship
Remember: absolute_value(p) measures the strength of the linear relationship.

When X_{1} and X_{2} are independent, E[X_{1}X_{2}]
= mu_{1}mu_{2}
 Proof:
 E[X_{1}X_{2}] = sum_{x1}sum_{x2}[x_{1}x_{2}f(x_{1}, x_{2})] = sum_{x1}sum_{x2}[x_{1}x_{2}f(x_{1})f(x_{2})] (by independence) = sum_{x1}[x_{1}f(x_{1})]*sum_{x2}[x_{2}f(x_{2})] = mu_{1}mu_{2}

If X, Y are independent then cov(X, Y)
= 0
Remember: the converse is not true.
 Proof:
 We know E[XY] = mu_{X}mu_{Y} when X, Y ind.
cov[X, Y] = E[XY] - mu_{X}mu_{Y} = mu_{X}mu_{Y} - mu_{X}mu_{Y} = 0
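
A sketch: build a joint PMF as a product of hypothetical marginals (hence independent by construction) and confirm the covariance vanishes:

```python
# Independent by construction: f(x, y) = f(x) * f(y)
fx = {0: 0.4, 1: 0.6}
fy = {1: 0.5, 2: 0.5}
joint = {(x, y): fx[x] * fy[y] for x in fx for y in fy}

muX = sum(x * p for (x, y), p in joint.items())
muY = sum(y * p for (x, y), p in joint.items())
Exy = sum(x * y * p for (x, y), p in joint.items())

print(abs(Exy - muX * muY) < 1e-12)  # -> True, i.e. cov(X, Y) = 0
```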

Let c, d be constants
E[cX + dY]
= cE[X] + dE[Y] = c*mu_{X} + d*mu_{Y}

Var(c_{1}X_{1} + c_{2}X_{2})
 c_{1}^{2}Var(X_{1}) + c_{2}^{2}Var(X_{2}) + 2c_{1}c_{2}Cov(X_{1}, X_{2})
 Proof:
 Var(c_{1}X_{1} + c_{2}X_{2}) = E[(c_{1}X_{1} + c_{2}X_{2} - c_{1}mu_{1} - c_{2}mu_{2})^{2}] = E[(c_{1}(X_{1} - mu_{1}) + c_{2}(X_{2} - mu_{2}))^{2}] = c_{1}^{2}E[(X_{1} - mu_{1})^{2}] + c_{2}^{2}E[(X_{2} - mu_{2})^{2}] + 2c_{1}c_{2}E[(X_{1} - mu_{1})(X_{2} - mu_{2})] = c_{1}^{2}Var(X_{1}) + c_{2}^{2}Var(X_{2}) + 2c_{1}c_{2}Cov(X_{1}, X_{2})
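
A numeric check of the variance formula for a linear combination (hypothetical joint PMF and constants):

```python
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # hypothetical
c1, c2 = 2.0, -3.0  # arbitrary constants

def E(g):
    """Expectation of g(X1, X2) under the joint PMF."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mu1, mu2 = E(lambda x, y: x), E(lambda x, y: y)
var1 = E(lambda x, y: (x - mu1) ** 2)
var2 = E(lambda x, y: (y - mu2) ** 2)
cov = E(lambda x, y: (x - mu1) * (y - mu2))

lhs = E(lambda x, y: (c1 * x + c2 * y - (c1 * mu1 + c2 * mu2)) ** 2)
rhs = c1 ** 2 * var1 + c2 ** 2 * var2 + 2 * c1 * c2 * cov
print(abs(lhs - rhs) < 1e-9)  # -> True
```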

E[X|Y=y] for continuous
 = integral[x*f(x|y)dx]
 Note: f(x|y) = f(x, y)/f(y)



Law of Iterated Expectations
E[Y]
 = E_{X}[E(Y|X)]
 Proof:
 E_{X}[E(Y|X)] = sum_{x}[E(Y|X=x)f(x)] = sum_{x}[sum_{y}[y*f(y|x)]f(x)] = sum_{x}sum_{y}[y*f(x, y)] = sum_{y}[y*f(y)] = E[Y]
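
The law can be checked on a hypothetical discrete joint PMF: computing E[Y] directly and via E_X[E(Y|X)] gives the same number:

```python
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # hypothetical

# Marginal f(x)
fX = {}
for (x, y), p in joint.items():
    fX[x] = fX.get(x, 0.0) + p

EY_direct = sum(y * p for (x, y), p in joint.items())

def E_Y_given_X(x):
    """E[Y | X = x] = sum_{y} y * f(x, y) / f(x)."""
    return sum(y * p for (xx, y), p in joint.items() if xx == x) / fX[x]

EY_iterated = sum(E_Y_given_X(x) * px for x, px in fX.items())
print(abs(EY_direct - EY_iterated) < 1e-12)  # -> True
```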

If E[Y|X] = E[Y] then cov(X, Y)
 = 0
 Proof:
 cov(X, Y) = E[XY] - mu_{X}mu_{Y}. By iterated expectations, E[XY] = E_{X}[X*E(Y|X)] = E_{X}[X*E[Y]] = E[X]E[Y] = mu_{X}mu_{Y}, so cov(X, Y) = mu_{X}mu_{Y} - mu_{X}mu_{Y} = 0

Var(X|Y)
= E[X^{2}|Y] - E[X|Y]^{2}


Standard Normal Distribution
Z~N(0,1)

If X~N(mu,sig^{2}) and Y=aX + b, then Y~
Y~N((a*mu +b), a^{2}sig^{2})

If X, Y are jointly normal then cov(X, Y)
cov(X, Y) = 0 <=> X, Y are independent.

