Linear Algebra 2

Card Set Information

Author:
mechtech2081
ID:
111299
Filename:
Linear Algebra 2
Updated:
2011-11-09 21:17:25
Tags:
Linear Algebra Eigenvalues Eigenvectors
Folders:

Description:
Linear Algebra

  1. If Ax=λx for some vector x, then λ is an eigenvalue of A.
    False. The equation Ax = λx must have a nontrivial solution.
  2. A matrix A is not invertible if and only if 0 is an eigenvalue of A.
    True. See the paragraph after Example 5.
  3. A number c is an eigenvalue of A if and only if the equation (A − cI)x = 0 has a nontrivial solution.
    True. See the discussion of equation (3).
  4. Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy.
    True. See Example 2 and the paragraph preceding it.
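A quick numerical version of card 4 (a minimal NumPy sketch; the matrix A, the vector v, and the recovered λ are made-up illustrations, not from the text):

    import numpy as np

    A = np.array([[4.0, -2.0],
                  [1.0,  1.0]])
    v = np.array([2.0, 1.0])         # candidate eigenvector

    Av = A @ v
    lam = Av[0] / v[0]               # candidate eigenvalue from one component
    # v is an eigenvector exactly when Av is a scalar multiple of v
    print(np.allclose(Av, lam * v))  # True: Av = 3v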
  5. To find the eigenvalues of A, reduce A to echelon form.
    False. See the warning after Example 3.
  6. If Ax=λx for some scalar λ, then x is an eigenvector of A.
    False. The vector x in Ax = λx must be nonzero.
  7. If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues.
    False. See Example 4 for a two-dimensional eigenspace, which contains two linearly independent eigenvectors corresponding to the same eigenvalue. The statement given is not at all the same as Theorem 2. In fact, it is the converse of Theorem 2 (for the case r = 2).
  8. A steady-state vector for a stochastic matrix is actually an eigenvector.
    True. See the paragraph after Example 1.
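Card 8 in code: a steady-state vector q satisfies Pq = q, so it is an eigenvector for λ = 1 (a sketch with a made-up column-stochastic matrix):

    import numpy as np

    P = np.array([[0.9, 0.5],
                  [0.1, 0.5]])          # columns sum to 1

    vals, vecs = np.linalg.eig(P)
    k = np.argmin(np.abs(vals - 1.0))   # index of the eigenvalue 1
    q = np.real(vecs[:, k])
    q = q / q.sum()                     # scale entries to sum to 1
    print(q, np.allclose(P @ q, q))     # [0.833... 0.166...] True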
  9. The eigenvalues of a matrix are on its main diagonal.
    False. Theorem 1 concerns a triangular matrix. See Examples 3 and 4 for counterexamples.
  10. An eigenspace of A is a null space of a certain matrix.
    True. See the paragraph following Example 3. The eigenspace of A corresponding to λ is the null space of the matrix A − λI.
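Card 10 computationally: the eigenspace for λ is Nul(A − λI). A sketch using SciPy's null_space (A and λ are illustrative; this A happens to have a two-dimensional eigenspace for λ = 2):

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[4.0, -1.0,  6.0],
                  [2.0,  1.0,  6.0],
                  [2.0, -1.0,  8.0]])
    lam = 2.0

    basis = null_space(A - lam * np.eye(3))    # basis for Nul(A - 2I)
    print(basis.shape[1])                      # 2: the eigenspace is a plane
    print(np.allclose(A @ basis, lam * basis))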
  11. The determinant of A is the product of the diagonal entries in A.
    False. See Example 1.
  12. An elementary row operation on A does not change the determinant.
    False. See Theorem 3.
  13. (det A)(det B) = det AB
    True. See Theorem 3.
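A numerical spot-check of card 13 (random matrices, purely illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))

    # (det A)(det B) = det(AB), up to floating-point rounding
    print(np.isclose(np.linalg.det(A) * np.linalg.det(B),
                     np.linalg.det(A @ B)))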
  14. If λ+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A.
    False. If λ + 5 is a factor, then λ = −5 is a root of the characteristic polynomial, so −5 (not 5) is an eigenvalue of A. See the solution of Example 4.
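The sign issue in card 14, checked numerically (illustrative matrix with −5 on the diagonal):

    import numpy as np

    A = np.array([[-5.0, 1.0],
                  [ 0.0, 3.0]])

    coeffs = np.poly(A)      # characteristic polynomial: (λ + 5)(λ - 3)
    print(np.roots(coeffs))  # eigenvalues are -5 and 3 -- not +5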
  15. If A is 3 × 3, with columns a₁, a₂, a₃, then det A equals the volume of the parallelepiped determined by a₁, a₂, a₃.
    False. The volume is |det A|, the absolute value of the determinant. See the paragraph before Theorem 3.
  16. det Aᵀ = (−1) det A.
    False. In fact, det Aᵀ = det A. See Theorem 3.
  17. The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A.
    True. See the paragraph before Example 4.
  18. A row replacement operation of A does not change the eigenvalues.
    False. See the warning after Theorem 4.
  19. Inner Product
    • u·v = uᵀv
    • aka dot product
  20. Length
    • ||v|| = √(v·v) = √(v₁² + v₂² + ⋯ + vₙ²)
    • ||v||² = v·v
  21. orthogonal
    • vectors u and v are orthogonal if u·v = 0
    • equivalently, iff ||u + v||² = ||u||² + ||v||² (see the sketch after this card)
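Cards 19-21 as code (a minimal NumPy sketch; u and v are made-up):

    import numpy as np

    u = np.array([3.0, 4.0])
    v = np.array([-4.0, 3.0])

    print(u @ v)             # inner product u·v = uᵀv  (card 19): 0.0
    print(np.sqrt(v @ v))    # length ||v|| = √(v·v)    (card 20): 5.0

    # Orthogonality (card 21): u·v = 0, equivalently the Pythagorean identity
    lhs = np.linalg.norm(u + v) ** 2
    rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
    print(np.isclose(lhs, rhs))  # True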
  22. Not every orthogonal set in Rⁿ is linearly independent.
    True. But every orthogonal set of nonzero vectors is linearly independent. See Theorem 4.
  23. If a set S = {u₁, ..., uₚ} has the property that uᵢ·uⱼ = 0 whenever i ≠ j, then S is an orthonormal set.
    False. To be orthonormal, the vectors in S must be unit vectors as well as being orthogonal to each other.
  24. If the columns of an m × n matrix A are orthonormal, then the linear mapping x ↦ Ax preserves lengths.
    True. See Theorem 7(a).
  25. The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c ≠ 0.
    True. See the paragraph before Example 3.
  26. An orthogonal matrix is invertible.
    True. See the paragraph before Example 7.
  27. Suppose W is a subspace of Rⁿ spanned by n nonzero orthogonal vectors. Explain why W = Rⁿ.
    A set of n nonzero orthogonal vectors must be linearly independent by Theorem 4, so if such a set spans W it is a basis for W. Thus W is an n-dimensional subspace of Rⁿ, and W = Rⁿ.
  28. Orthogonal Basis
    An orthogonal basis for a subspace W of Rⁿ is a basis for W that is also an orthogonal set. The weights of y relative to such a basis are computed directly (see the sketch after this card):

    c₁ = (y·u₁)/(u₁·u₁), and in general cⱼ = (y·uⱼ)/(uⱼ·uⱼ)
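The weight formula of card 28 in code (the orthogonal basis {u₁, u₂} and the vector y are illustrative):

    import numpy as np

    u1 = np.array([1.0,  1.0, 0.0])
    u2 = np.array([1.0, -1.0, 0.0])   # u1·u2 = 0: orthogonal basis of a plane W
    y  = np.array([3.0,  1.0, 0.0])   # a vector lying in W

    c1 = (y @ u1) / (u1 @ u1)
    c2 = (y @ u2) / (u2 @ u2)
    print(c1, c2)                               # 2.0 1.0
    print(np.allclose(y, c1 * u1 + c2 * u2))    # True: y = c1·u1 + c2·u2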
  29. If z is orthogonal to u₁ and to u₂ and if W = Span{u₁, u₂}, then z must be in W⊥.
    True. See the calculations for z₂ in Example 1, or the box after Example 6 in Section 6.1.
  30. For each y and each subspace W, the vector y - projWy is orthogonal to W.
    True. See the Orthogonal Decomposition Theorem.
  31. The orthogonal projection ŷ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute ŷ.
    False. See the last paragraph in the proof of Theorem 8, or see the second paragraph after the statement of Theorem 9.
  32. If y is in a subspace W, then the orthogonal projection of y onto W is y itself.
    True. See the box before the Best Approximation Theorem.
  33. If the columns of an n × p matrix U are orthonormal, then UUᵀy is the orthogonal projection of y onto the column space of U.
    True. Theorem 10 applies to the column space W of U because the columns of U are linearly independent and hence form a basis for W.
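Card 33 in code: for U with orthonormal columns, UUᵀy projects y onto Col U (here U comes from a QR factorization of a made-up matrix):

    import numpy as np

    rng = np.random.default_rng(1)
    U, _ = np.linalg.qr(rng.standard_normal((5, 2)))  # orthonormal columns

    y = rng.standard_normal(5)
    y_hat = U @ (U.T @ y)          # orthogonal projection of y onto Col U

    # the residual is orthogonal to every column of U
    print(np.allclose(U.T @ (y - y_hat), 0.0))  # True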
  34. Orthogonal Projection
    ŷ = ((y·u) / (u·u))*u
  35. The distance from y to L.
    The length of the perpendicular line segment from y to the orthogonal projection ŷ (see the sketch after this card):

    ||y − ŷ|| = √((y − ŷ)·(y − ŷ))
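Cards 34-35 together (y and u are textbook-style numbers, used here only as an illustration):

    import numpy as np

    y = np.array([7.0, 6.0])
    u = np.array([4.0, 2.0])            # L = Span{u}

    y_hat = ((y @ u) / (u @ u)) * u     # projection of y onto L (card 34)
    dist = np.linalg.norm(y - y_hat)    # distance from y to L (card 35)
    print(y_hat, dist)                  # [8. 4.] 2.236... (= √5)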
  36. An m × n matrix U has orthonormal columns if and only if
    UᵀU = I (the n × n identity matrix)
  37. Let U be an m × n matrix with orthonormal columns, and let x and y be in Rⁿ.
    Then:
    a. ||Ux|| =
    b. (Ux)·(Uy) =
    c. (Ux)·(Uy) =
    • a. ||x||
    • b. x·y
    • c. 0 if and only if x·y = 0
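Cards 36-37 verified numerically (U is an illustrative QR factor with orthonormal columns):

    import numpy as np

    rng = np.random.default_rng(2)
    U, _ = np.linalg.qr(rng.standard_normal((4, 3)))

    x = rng.standard_normal(3)
    y = rng.standard_normal(3)

    print(np.allclose(U.T @ U, np.eye(3)))                        # card 36
    print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # 37a
    print(np.isclose((U @ x) @ (U @ y), x @ y))                   # 37b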
  38. The general least-squares problem is to find an x that makes Ax as close as possible to b.
    True. See the beginning of the section. The distance from Ax to b is ||Ax − b||.
  39. A least-squares solution of Ax = b is a vector x̂ that satisfies Ax̂ = b̂, where b̂ is the orthogonal projection of b onto Col A.
    True. See the comments about equation (1).
  40. A least-squares solution of Ax = b is a vector x̂ such that ||b − Ax|| ≤ ||b − Ax̂|| for all x in Rⁿ.
    False. The inequality points in the wrong direction. See the definition of a least-squares solution.
  41. Any solution of AᵀAx = Aᵀb is a least-squares solution of Ax = b.
    True. See Theorem 13.
  42. If the columns of A are linearly independent, then the equation Ax = b has exactly one least-squares solution.
    True. See Theorem 14.
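Cards 38-42 in one sketch: solve the normal equations AᵀAx̂ = Aᵀb and compare against NumPy's built-in least-squares solver (the data is made up):

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])      # linearly independent columns
    b = np.array([6.0, 0.0, 0.0])

    # normal equations (card 41): any solution is a least-squares solution
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)
    print(x_hat)                                     # [ 5. -3.]

    # same answer from the library routine
    print(np.linalg.lstsq(A, b, rcond=None)[0])

    # A x_hat = b_hat, the projection of b onto Col A (card 39),
    # so the residual b - A x_hat is orthogonal to Col A
    print(np.allclose(A.T @ (b - A @ x_hat), 0.0))   # True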
