Econ7630_Exam1a

Card Set Information

Author:
mattstam
ID:
65858
Filename:
Econ7630_Exam1a
Updated:
2011-02-13 13:32:12
Tags:
econometrics probability
Folders:

Description:
Exam 1 (cards part a) for graduate level Econometrics 1 at LSU for Spring 2011. Includes many not too terribly rigorous "proofs."
The flashcards below were created by user mattstam on FreezingBlue Flashcards.


  1. PDF Facts
    f(x) = P[X = x] >= 0 for all x

    Sum[f(x)] = 1
  2. Random Variable
    A function that assigns a unique numerical value to each sample space outcome.
  3. CDF
    F(x) = P[X <= x]

    For a discrete r.v., F(x) = sum over all t <= x of f(t).
  4. Given this discrete CDF, what is P[X<=1.5]?

    x f(x)

    1 0.1
    2 0.2
    3 0.3
    4 0.4
    0.1

    F(1.5) = P[X <= 1.5] = P[X = 1] = 0.1, since the only support point at or below 1.5 is x = 1.
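A minimal sketch of the card 4 calculation, using a hypothetical `cdf` helper over the pmf table from the card:

```python
# Card 4: evaluate the CDF of the discrete pmf at 1.5.
# F(1.5) = P[X <= 1.5] = P[X = 1] = 0.1, since X takes no value in (1, 1.5].
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

def cdf(t):
    """P[X <= t]: sum the pmf over all support points at or below t."""
    return sum(p for x, p in pmf.items() if x <= t)

assert abs(cdf(1.5) - 0.1) < 1e-12
assert abs(cdf(4) - 1.0) < 1e-12  # the pmf sums to 1 (card 1)
```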
  5. Joint PDF: f (X, Y) for discrete r.v.
    P[X = x, Y = y]
  5. Joint PDF: f(X, Y) for continuous r.v. over a certain range.
    P[a <= X <= b, c <= Y <= d] = integral from a to b [ integral from c to d [f(x,y)dy] dx ]
  7. Marginal PDF: f (x) for discrete r.v.
    f(x) = sum_y[f(x,y)]

    f(y) = sum_x[f(x,y)]
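The marginals in card 7 can be checked numerically; the 2x2 joint pmf below is made up for illustration:

```python
# Card 7 sketch: marginal pmfs by summing a (made-up) joint pmf
# over the other variable: f(x) = sum_y f(x,y), f(y) = sum_x f(x,y).
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

f_x, f_y = {}, {}
for (x, y), p in joint.items():
    f_x[x] = f_x.get(x, 0) + p  # sum over y
    f_y[y] = f_y.get(y, 0) + p  # sum over x

assert abs(f_x[0] - 0.4) < 1e-12 and abs(f_x[1] - 0.6) < 1e-12
assert abs(f_y[0] - 0.3) < 1e-12 and abs(f_y[1] - 0.7) < 1e-12
```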
  8. Marginal PDF: f(X) for continuous r.v.
    f(x) = integral[f(x,y)dy]

    f(y) = integral[f(x,y)dx]
  9. Conditional Probability: P[X = x | Y = y] (i.e. f(X|Y))
    = P[X = x, Y = y]/P[Y = y] = f(x, y)/f(y)

    The conditional probability that X equals x, given that Y equals y, equals the joint PDF of X and Y divided by the marginal PDF of Y.
  10. X, Y are independent IFF
    (1) f(x, y) = f(x)f(y) for all x, y.

    • Or (derived from (1))
    • (2) f(x|y) = f(x),
    • (3) f(y|x) = f(y) for all x, y.
  11. Given that X, Y are independent show that f(x|y) = f(x) for all x, y.
    • We know that
    • (i) X,Y are independent if f(x,y) = f(x)f(y) for all x,y.
    • (ii) f(x|y) = f(x,y)/f(y)

    So by (i) f(x|y) = f(x)f(y)/f(y) -> f(x|y) = f(x).
  12. Expected Value: E[X] for
    (1) Discrete
    (2) Continuous
    (1) E[X] = sum[x*f(x)]

    (2) E[X] = integral[x*f(x)dx]
  13. What does Expected Value mean for a discrete random variable?
    The average value of X on an infinite number of experimental trials.

    For example, E[X] for a six-sided fair die is 3.5. Saying the likely value of a die roll is 3.5 clearly does not make sense; the long-run average does.
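The die example in card 13 computed exactly as sum[x*f(x)]:

```python
from fractions import Fraction

# Card 13: E[X] for a fair six-sided die, E[X] = sum x * f(x) with f(x) = 1/6.
faces = range(1, 7)
EX = sum(x * Fraction(1, 6) for x in faces)
assert EX == Fraction(7, 2)  # 3.5: the long-run average, not a possible roll
```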
  14. Let g(X) be a function of X. What is E[g(X)]?
    E[g(X)] = sum[g(x)f(x)] (discrete) or integral[g(x)f(x)dx] (continuous)

    • Remember: even though X is being transformed by g(X), you still use the original PDF f(x).
  15. If g(X) equals c, a constant, then E[g(X)] =
    E[c] = c

    Remember: The Expected Value of a constant is that constant. This allows you to pull constants out of the E[ ] operator.
  16. Show that E[c] = c, where c is a constant.
    Let g(X) = c -> E[g(X)] = integral[g(x)f(x)dx] = integral[cf(x)dx] = c*integral[f(x)dx] = c(1) = c
  17. If g(X) = aX + b, then E[g(X)] =
    E[aX + b] = aE[X] + b
  18. Show that E[aX + b] = aE[X] + b
    Let g(X) = aX + b -> E[g(X)] = E[aX + b] = E[aX] + E[b] = aE[X] + b
  19. Let g(X) = g1(X) + g2(X) + . . . + gn(X),

    what is E[g(X)]?
    E[g(X)] = E[g1(X)] + E[g2(X)] + . . . + E[gn(X)]

    • Remember: the expected value of the sum is the sum of the expected values.
  20. Var[X]
    = E[(X - mu)^2] = E[X^2] - mu^2, where mu = E[X].

    • Proof:
    • E[(X - mu)^2] = E[(X - mu)(X - mu)] = E[X^2 - 2muX + mu^2] = E[X^2] - 2muE[X] + mu^2 = E[X^2] - 2mu*mu + mu^2 = E[X^2] - 2mu^2 + mu^2 = E[X^2] - mu^2
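A numeric check of the variance shortcut from card 20, on a small made-up pmf:

```python
# Card 20: verify E[(X - mu)^2] = E[X^2] - mu^2 on a small pmf.
pmf = {1: 0.2, 2: 0.5, 4: 0.3}

mu = sum(x * p for x, p in pmf.items())
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())    # E[(X - mu)^2]
var_short = sum(x * x * p for x, p in pmf.items()) - mu**2  # E[X^2] - mu^2
assert abs(var_def - var_short) < 1e-12
```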
  21. Let Y = a + bX, what is Var[Y]?
    = b^2Var[X]

    • Proof:
    • Let Y = a + bX -> E[Y] = a + bE[X]. Then Var(Y) = E[(Y - E[Y])^2] = E[(a + bX - a - bE[X])^2] = E[b^2(X - E[X])^2] = b^2Var(X).
  22. Standardized Variable:

    z =
    • (x - mu)/sig
    • where sig = sqrt[Var(X)] (i.e. the standard deviation)
  23. Show that E[z] = 0
    • E[z] = E[(1/sig)X - mu/sig]
    • = (1/sig)E[X] - E[mu/sig]
    • = mu/sig - mu/sig = 0
  24. Show that Var(z) = 1
    • Var(z) = Var[(1/sig)X - mu/sig]
    • = (1/sig)^2Var(X) = sig^2/sig^2 = 1

    Remember: to get to line two, Var(aX +- b) = a^2Var(X).
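Cards 22-24 can be checked together: standardizing a variable gives mean 0 and variance 1. The pmf is made up.

```python
import math

# Cards 22-24: z = (x - mu)/sig has E[z] = 0 and Var(z) = 1.
pmf = {1: 0.2, 2: 0.5, 4: 0.3}
mu = sum(x * p for x, p in pmf.items())
sig = math.sqrt(sum((x - mu) ** 2 * p for x, p in pmf.items()))

Ez = sum((x - mu) / sig * p for x, p in pmf.items())
Vz = sum(((x - mu) / sig) ** 2 * p for x, p in pmf.items()) - Ez**2
assert abs(Ez) < 1e-12 and abs(Vz - 1) < 1e-12
```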
  25. Let g(X1, X2) have joint PDF f(X1, X2).

    E[g(X1, X2)] =
    sum_x1[sum_x2[g(x1,x2)f(x1,x2)]] (discrete) or integral[integral[g(x1,x2)f(x1,x2)dx2]dx1] (continuous)
  26. cov(X1, X2)
    • = E[X1X2] - mu1mu2
    • Derivation:
    • cov(X1, X2) = E[(X1 - mu1)(X2 - mu2)] = E[X1X2 - mu1X2 - mu2X1 + mu1mu2] = E[X1X2] - mu1E[X2] - mu2E[X1] + mu1mu2 = E[X1X2] - mu1mu2
  27. If cov(X,Y)
    1. > 0
    2. < 0
    3. = 0
    • 1. (X, Y) pairs tend to be both greater than their means or both less than their means.
    • 2. (X, Y) pairs tend to be mixed about their means (one greater and one less)
    • 3. (X, Y) pairs "evenly" spread about their means.
  28. Correlation
    p = cov(X, Y)/(sigX*sigY)
  29. If p (for correlation)
    1. = 1
    2. = -1
    3. = 0
    • 1. perfect positive relation
    • 2. perfect negative relation
    • 3. no linear relationship

    Remember: absolute_value(p) measures the strength of the linear relationship.
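A sketch of cards 27-29 on made-up data: for an exact linear relation Y = 2X + 1, the sample correlation is +1.

```python
import math

# Cards 27-29: Y = 2X + 1 gives a perfect positive linear relation, p = 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
rho = cov / (sx * sy)  # correlation = cov / (sigX * sigY), card 28
assert abs(rho - 1.0) < 1e-12
```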
  30. When X1 and X2 are independent, E[X1X2]
    = mu1mu2

    • Proof:
    • E[X1X2] = integral[integral[x1x2*f(x1,x2)dx2]dx1] = integral[integral[x1x2*f(x1)f(x2)dx2]dx1] (by independence) = integral[x1f(x1)dx1]*integral[x2f(x2)dx2] = mu1mu2
  31. If X, Y are independent then cov(X, Y)
    = 0

    Remember: the converse is not true.

    • Proof:
    • We know E[XY] = muXmuY when X, Y are independent.

    cov[X, Y] = E[XY] - muXmuY = muXmuY - muXmuY = 0
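Card 31 checked exactly: build a joint pmf as the product f(x)f(y), so X and Y are independent by construction, and the covariance is exactly zero. The marginals are made up.

```python
from fractions import Fraction

# Card 31: independence (joint = product of marginals) implies cov = 0.
fx = {0: Fraction(1, 4), 1: Fraction(3, 4)}
fy = {1: Fraction(1, 2), 3: Fraction(1, 2)}
joint = {(x, y): px * py for x, px in fx.items() for y, py in fy.items()}

EXY = sum(x * y * p for (x, y), p in joint.items())
muX = sum(x * p for x, p in fx.items())
muY = sum(y * p for y, p in fy.items())
assert EXY - muX * muY == 0  # cov(X, Y) = E[XY] - muX*muY = 0, exactly
```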
  32. Let c, d be constants

    E[cX + dY]
    = cE[X] + dE[Y] = c*muX + d*muY
  33. Var(c1X1 + c2X2)
    • c1^2Var(X1) + c2^2Var(X2) + 2c1c2Cov(X1, X2)
    • Proof:
    • Var(c1X1 + c2X2) = E[(c1X1 + c2X2 - c1mu1 - c2mu2)^2] = E[(c1(X1 - mu1) + c2(X2 - mu2))^2] = c1^2E[(X1 - mu1)^2] + c2^2E[(X2 - mu2)^2] + 2c1c2E[(X1 - mu1)(X2 - mu2)] = c1^2Var(X1) + c2^2Var(X2) + 2c1c2Cov(X1, X2)
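A numeric check of the card 33 identity on a made-up joint pmf where X1 and X2 are deliberately correlated, so the covariance term matters:

```python
# Card 33: Var(c1 X1 + c2 X2) = c1^2 Var(X1) + c2^2 Var(X2) + 2 c1 c2 Cov(X1, X2).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
c1, c2 = 2.0, -1.0

def E(g):
    """Expectation of g(X1, X2) under the joint pmf."""
    return sum(g(x1, x2) * p for (x1, x2), p in joint.items())

m1, m2 = E(lambda a, b: a), E(lambda a, b: b)
v1 = E(lambda a, b: (a - m1) ** 2)
v2 = E(lambda a, b: (b - m2) ** 2)
cov = E(lambda a, b: (a - m1) * (b - m2))

lhs = E(lambda a, b: (c1 * a + c2 * b - (c1 * m1 + c2 * m2)) ** 2)
rhs = c1**2 * v1 + c2**2 * v2 + 2 * c1 * c2 * cov
assert abs(lhs - rhs) < 1e-12
```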
  34. E[X|Y=y] for continuous
    • = integral(x*f(x|y)dx)
    • Note: f(x|y) = f(x,y)/f(y)
  35. E[a + bX | X]
    = a + bX
  36. E[g(X)|X]
    = g(X)

    Remember: conditional on X, the value of X is known, so any function of X is a known constant.
  37. Law of Iterated Expectations

    E[Y]
    • = Ex[Ey(Y|X)]
    • Proof:
    • Ex[E(Y|X)] = integral[E(Y|X=x)f(x)dx] = integral[integral[y*f(y|x)dy]f(x)dx] = integral[integral[y*f(x,y)dy]dx] = integral[y*f(y)dy] = E[Y]
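A quick numeric check of the law of iterated expectations (card 37) on a small, made-up joint pmf:

```python
# Card 37: E[Y] = Ex[E(Y|X)], checked on a 2x2 joint pmf.
joint = {(0, 1): 0.1, (0, 2): 0.3, (1, 1): 0.4, (1, 2): 0.2}

EY = sum(y * p for (_, y), p in joint.items())  # direct E[Y]

fX = {}
for (x, _), p in joint.items():
    fX[x] = fX.get(x, 0) + p  # marginal of X

EY_iter = 0.0
for x, px in fX.items():
    # E[Y|X=x] = sum_y y * f(x,y) / f(x), then weight by f(x).
    E_Y_given_x = sum(y * p for (x2, y), p in joint.items() if x2 == x) / px
    EY_iter += E_Y_given_x * px

assert abs(EY - EY_iter) < 1e-12
```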
  38. If E[Y|X] = E[Y] then cov(X,Y)
    • = 0
    • Proof:
    • cov(X,Y) = E[XY] - E[X]E[Y]. By iterated expectations, E[XY] = Ex[E(XY|X)] = Ex[X*E(Y|X)] = Ex[X*E(Y)] = E[X]E[Y], so cov(X,Y) = 0.
  39. Var(X|Y)
    = E[X^2|Y] - (E[X|Y])^2
  40. Normal Distribution
    X~N(mu, sig^2)

    f(x) = (1/(sig*sqrt(2*pi)))*exp[-(x - mu)^2/(2sig^2)]
  41. Standard Normal Distribution

    Z~N(0,1)
    z = (X - mu)/sig ~ N(0,1), i.e. a normal r.v. standardized to mean 0 and variance 1.
  42. If X~N(mu,sig^2) and Y=aX + b, then Y~
    Y~N((a*mu + b), a^2sig^2)
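A simulation sketch of card 42, assuming made-up values a = 2, b = 3 and X~N(0,1), so Y should be approximately N(3, 4):

```python
import random

# Card 42: if X~N(0,1) and Y = 2X + 3, then Y~N(3, 4). Check sample moments.
random.seed(0)
n = 100_000
ys = [2 * random.gauss(0, 1) + 3 for _ in range(n)]
my = sum(ys) / n
vy = sum((y - my) ** 2 for y in ys) / n
assert abs(my - 3) < 0.1   # a*mu + b = 3
assert abs(vy - 4) < 0.2   # a^2*sig^2 = 4
```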
  43. If X,Y are jointly normal then cov(X,Y)
    cov(X,Y) = 0 <=> X,Y are independent.
