
What is the determinant of a triangular matrix?
det A = a_{11} * a_{22} * ... * a_{nn}
the product of the entries on the main diagonal
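As an illustrative check (a NumPy sketch with made-up entries; `np.linalg.det` computes the determinant numerically):

```python
import numpy as np

# A 3x3 upper triangular matrix; the entries are hypothetical example values.
A = np.array([[2.0, 5.0, 7.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])

det_A = np.linalg.det(A)                     # numerical determinant
diag_product = A[0, 0] * A[1, 1] * A[2, 2]   # product of the main diagonal: 2 * 3 * 4
```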

What effect do row operations have on the determinant?
 For a square matrix A
 1. If a multiple of one row of A is added to another row to produce B, then det B = det A
 2. If two rows of A are interchanged to produce B, then det B = -det A
 3. If one row of A is multiplied by a scalar k to produce B, then det B = k * det A
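The three rules above can be verified numerically on a small example matrix (values chosen arbitrarily for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # det A = 1*4 - 2*3 = -2

# 1. Row replacement: add 2 * (row 0) to row 1 -> determinant unchanged
B1 = A.copy()
B1[1] += 2 * B1[0]

# 2. Row interchange: swap the two rows -> determinant changes sign
B2 = A[[1, 0]]

# 3. Row scaling: multiply row 0 by k = 5 -> determinant multiplied by k
B3 = A.copy()
B3[0] *= 5
```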

When is a square matrix invertible?
A square matrix A is invertible if and only if det A != 0
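A quick numerical illustration (example matrices chosen by hand; a matrix with proportional rows has determinant 0 and is singular):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # det = -2 != 0, so A is invertible
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row is twice the first, so det = 0

A_inv = np.linalg.inv(A)     # succeeds precisely because det A != 0
```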

Does transposing a matrix change its determinant?
If A is a square matrix, then det A^{T} = det A

What is the multiplicative property of determinants?
If A and B are n * n matrices, then det AB = (det A)(det B)
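Both the multiplicative property and the transpose property from the previous card can be checked on random matrices (a sketch; the seed and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

det_product = np.linalg.det(A @ B)                    # det AB
product_of_dets = np.linalg.det(A) * np.linalg.det(B) # (det A)(det B)
det_transpose = np.linalg.det(A.T)                    # det A^T
```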

What makes a linear system consistent?
 A linear system is consistent if and only if the rightmost column of the augmented matrix is not a pivot column. In other words, an echelon form of the augmented matrix has no row of the form
 [0 ... 0 b] with b nonzero
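Equivalently, adjoining b must not raise the rank of the coefficient matrix. A small sketch (hypothetical system with proportional rows):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b_good = np.array([[3.0], [6.0]])   # consistent: second equation is twice the first
b_bad  = np.array([[3.0], [7.0]])   # inconsistent: echelon form has a row [0 0 b], b != 0

rank_A    = np.linalg.matrix_rank(A)
rank_good = np.linalg.matrix_rank(np.hstack([A, b_good]))
rank_bad  = np.linalg.matrix_rank(np.hstack([A, b_bad]))
```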

What is a span?
If v_{1}, ... , v_{p} are in R^{n}, then the set of all linear combinations of v_{1}, ... , v_{p} is denoted by Span {v_{1}, ... , v_{p}} and is called the subset of R^{n} spanned by v_{1}, ... , v_{p}

What are the properties of the matrix-vector product?
 If A is an m * n matrix, u and v are vectors in R^{n}, and c is a scalar, then
 1. A(u + v) = Au + Av
 2. A(cu) = c(Au)
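Both properties can be verified numerically on random data (sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((3, 4))   # an m x n matrix with m = 3, n = 4
u = rng.standard_normal(4)
v = rng.standard_normal(4)
c = 2.5

additive_lhs = A @ (u + v)        # A(u + v)
additive_rhs = A @ u + A @ v      # Au + Av
scalar_lhs   = A @ (c * u)        # A(cu)
scalar_rhs   = c * (A @ u)        # c(Au)
```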

Does a homogeneous equation have a nontrivial solution?
The homogeneous equation Ax = 0 has a nontrivial solution if and only if the equation has at least one free variable.
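A free variable exists exactly when rank A is less than the number of columns. A sketch with a hand-picked rank-1 matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, three columns -> two free variables

rank = np.linalg.matrix_rank(A)
n_cols = A.shape[1]
x = np.array([1.0, 1.0, -1.0])   # one nontrivial solution of Ax = 0
```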

When is a set of vectors linearly independent?
 An indexed set of vectors {v_{1}, ... , v_{p}} in R^{n} is said to be linearly independent if the vector equation
 x_{1}v_{1} + x_{2}v_{2} + ... + x_{p}v_{p} = 0
 has only the trivial solution.

When is a set of vectors linearly dependent?
 The set {v_{1} ... v_{p}} is said to be linearly dependent if there exist weights c_{1}, c_{2}, ... , c_{p}, not all zero, such that
 c_{1}v_{1} + c_{2}v_{2} + ... + c_{p}v_{p} = 0
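In practice, vectors stacked as columns of a matrix are independent exactly when the rank equals the number of vectors. A sketch (example vectors chosen by hand):

```python
import numpy as np

# Independent pair in R^3: rank equals the number of vectors (2)
indep = np.column_stack(([1.0, 0.0, 1.0], [0.0, 1.0, 1.0]))

# Dependent pair: the second column is twice the first, so the
# weights c_1 = 2, c_2 = -1 give 2*v_1 - v_2 = 0
dep = np.column_stack(([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))
```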

How can you tell if two vectors are linearly dependent?
A set of two vectors is linearly dependent if and only if at least one of the vectors is a multiple of the other.

If a set contains more vectors than there are entries in each vector, is the set linearly dependent or independent?
If a set contains more vectors than there are entries in each vector, the set is linearly dependent.

If a set contains the zero vector, is the set linearly dependent or independent?
If a set contains the zero vector, the set would be linearly dependent.
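Both of the last two facts can be seen via rank (the examples below are hypothetical):

```python
import numpy as np

# Three vectors in R^2: rank is at most 2 < 3, so the set must be dependent
too_many = np.array([[1.0, 0.0, 2.0],
                     [0.0, 1.0, 3.0]])

# A set containing the zero vector: put weight 1 on the zero vector, 0 elsewhere
with_zero = np.column_stack(([1.0, 2.0], [0.0, 0.0]))
```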

What is a transformation?
A transformation (or function, or mapping) T from R^{n} to R^{m} is a rule that assigns to each vector x in R^{n} a vector T(x) in R^{m}, where the set R^{n} is the domain and the set R^{m} is the codomain of T.

How do you know if a transformation is linear?
 A transformation T is linear if, for all vectors u, v and all scalars c
 1. T(u + v) = T(u) + T(v)
 2. T(cu) = cT(u)
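The two conditions can be checked numerically for a candidate map. A sketch using projection onto the first coordinate axis of R^2 (a standard example of a linear transformation; the test vectors are arbitrary):

```python
import numpy as np

def T(x):
    """Projection of R^2 onto the x-axis -- a linear transformation."""
    return np.array([x[0], 0.0])

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
c = 4.0
```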

What are the properties of matrix addition and multiplication by scalars?
 Let A, B, and C be matrices of the same size, and let r and s be scalars
 1. A + B = B + A
 2. (A + B) + C = A + (B + C)
 3. A + 0 = A
 4. r(A + B) = rA + rB
 5. (r + s)A = rA + sA
 6. r(sA) = (rs)A
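All six identities can be spot-checked on random matrices of a common size (seed and shapes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((2, 3))
C = rng.standard_normal((2, 3))
Z = np.zeros((2, 3))   # the zero matrix of the same size
r, s = 2.0, -3.0
```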

What are the properties of matrix multiplication?
 Let A be an m * n matrix, and let B and C have sizes for which the indicated sums and products are defined, and let r be any scalar
 1. A(BC) = (AB)C, associative law of multiplication
 2. A(B + C) = AB + AC, left distributive law
 3. (B + C)A = BA + CA, right distributive law
 4. r(AB) = (rA)B = A(rB)
 5. I_{m}A = A = AI_{n}
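These laws can likewise be spot-checked on random matrices with compatible sizes (shapes chosen only so that each product is defined):

```python
import numpy as np

rng = np.random.default_rng(3)
A  = rng.standard_normal((2, 3))
B  = rng.standard_normal((3, 4))
B2 = rng.standard_normal((3, 4))   # same size as B for the distributive laws
C  = rng.standard_normal((4, 5))   # conformable with B for associativity
E  = rng.standard_normal((4, 2))   # for the right distributive law
r  = 1.5
```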

What are the properties of transposed matrices?
 Let A and B denote matrices whose sizes are appropriate for the following sums and products, and let r be any scalar
 1. (A^{T})^{T} = A
 2. (A + B)^{T} = A^{T} + B^{T}
 3. (rA)^{T} = rA^{T}
 4. (AB)^{T} = B^{T}A^{T}
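All four transpose rules verified on random matrices (sizes arbitrary but conformable; note the reversed order in the product rule):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((2, 3))   # same size as A for the sum rule
C = rng.standard_normal((3, 4))   # conformable with A for the product rule
r = -2.0
```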

What are the equivalent statements in the Invertible Matrix Theorem?
 Let A be a square n * n matrix. Then the following statements are either all true or all false
 1. A is an invertible matrix
 2. A is row equivalent to the n * n identity matrix
 3. A has n pivots
 4. The equation Ax = 0 has only the trivial solution
 5. The columns of A form a linearly independent set
 6. The linear transformation x -> Ax is one-to-one
 7. The equation Ax = b has at least one solution for each b in R^{n}
 8. The columns of A span R^{n}
 9. The linear transformation x -> Ax maps R^{n} onto R^{n}
 10. There is an n * n matrix C such that CA = I
 11. There is an n * n matrix D such that AD = I
 12. A^{T} is an invertible matrix
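Several of these equivalent statements can be confirmed at once on a concrete invertible matrix (the entries below are a hypothetical example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # det = 2*1 - 1*1 = 1, so every IMT statement holds
n = A.shape[0]

C = np.linalg.inv(A)         # serves as both a left inverse (CA = I) and a right inverse (AC = I)
```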

