The flashcards below were created by user mechtech2081 on FreezingBlue Flashcards.

In order for a matrix B to be the inverse of A, both equations AB=I and BA=I must be true.
True, by definition of invertible.

If A and B are n x n and invertible, then A^{-1}B^{-1} is the inverse of AB.
False.
(AB)^{-1} = B^{-1}A^{-1}
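For intuition, here is a quick check in pure Python with exact rational arithmetic (the 2x2 matrices A and B below are arbitrary invertible examples, not from the card):

```python
from fractions import Fraction

def mul2(X, Y):
    """Product of two 2x2 matrices stored as [[a, b], [c, d]]."""
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

def inv2(X):
    """Inverse of a 2x2 matrix via 1/(ad - bc) * [[d, -b], [-c, a]]."""
    (a, b), (c, d) = X
    det = Fraction(a*d - b*c)   # assumed nonzero (X invertible)
    return [[d/det, -b/det], [-c/det, a/det]]

A = [[1, 2], [3, 5]]   # arbitrary invertible examples
B = [[2, 1], [1, 1]]

# (AB)^{-1} equals B^{-1}A^{-1}; the order of the factors reverses.
```

Note that mul2(inv2(A), inv2(B)) gives a different matrix here, so the order really matters.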

If A = [a b / c d] and ab – cd ≠ 0, then A is invertible.
False. If A = [1 1 / 0 0], then ab – cd = 1 – 0 ≠ 0, but Theorem 4 shows that this matrix is not invertible, because ad – bc = 0.
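The counterexample can be checked directly (a minimal sketch in pure Python):

```python
def det2(M):
    """ad - bc for a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = M
    return a*d - b*c

A = [[1, 1], [0, 0]]                              # the card's counterexample
ab_minus_cd = A[0][0]*A[0][1] - A[1][0]*A[1][1]
# ab - cd = 1 is nonzero, yet det A = ad - bc = 0, so A is not invertible.
```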

Each elementary matrix is invertible.
True

If the equation Ax=0 has only the trivial solution, then A is row equivalent to the n x n identity matrix.
True, by the IMT. If statement (d) of the IMT is true, then so is statement (b).

If the columns of A span R^{n}, then the columns are linearly independent.
True. If statement (h) of the IMT is true, then so is statement (e).

If A is an n x n matrix, then the equation Ax=b has at least one solution for each b in R^{n}.
False. Statement (g) of the IMT is true only for invertible matrices.

If A^{T} is not invertible, then A is not invertible.
True, by the IMT. If A^{T} is not invertible, then statement (l) of the IMT is false, and hence statement (a) must also be false.

If the equation Ax=0 has a nontrivial solution, then A has fewer than n pivot positions.
True, by the IMT. If the equation Ax = 0 has a nontrivial solution, then statement (d) of the IMT is false. In this case, all the lettered statements in the IMT are false, including statement (c), which means that A must have fewer than n pivot positions.

If there is an n x n matrix D such that AD=I, then there is also an n x n matrix C such that CA=I.
True. If statement (k) of the IMT is true, then so is statement (j).

If the columns of A are linearly independent, then the columns of A span R^{n}.
True. If statement (e) of the IMT is true, then so is statement (h).

If the equation Ax=b has at least one solution for each b in R^{n}, then the solution is unique for each b.
True. See the remark immediately following the proof of the IMT.

If the linear transformation x ↦ Ax maps R^{n} into R^{n}, then A has n pivot positions.
False. The first part of the statement is not part (i) of the IMT; part (i) says the transformation maps R^{n} onto R^{n}. In fact, if A is any n x n matrix, the linear transformation x ↦ Ax maps R^{n} into R^{n}, yet not every such matrix has n pivot positions.

If C is 6x6 and the equation Cx=v is consistent for every v in R^{6}, is it possible that for some v, the equation Cx=v has more than one solution?
By (g) of the IMT, C is invertible. Hence, each equation Cx = v has a unique solution.

If nxn matrices E and F have the property that EF=I, then E and F commute.
By the box following the IMT, E and F are invertible and are inverses. So FE = I = EF, and so E and F commute.
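A concrete 2x2 instance of this (E and its inverse F are chosen for illustration):

```python
def mul2(X, Y):
    """Product of two 2x2 matrices stored as [[a, b], [c, d]]."""
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

E = [[2, 1], [1, 1]]        # EF = I by construction: F is E's inverse
F = [[1, -1], [-1, 2]]
I = [[1, 0], [0, 1]]
# EF = I forces FE = I as well, so E and F commute.
```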

If the equation Gx=y has more than one solution for some y in R^{n}, can the columns of G span R^{n}?
The matrix G cannot be invertible, so (h) of the IMT is false and the columns of G do not span R^{n}.

If L is n x n and the equation Lx=0 has the trivial solution, do the columns of L span R^{n}?
No conclusion about the columns of L may be drawn, because no information about L has been given. The equation Lx = 0 always has the trivial solution.

The Invertible Matrix Theorem
Let A be an n x n matrix. Then the following statements are either all true or all false.
 a. A is an invertible matrix.
 b. A is row equivalent to the n x n identity matrix.
 c. A has n pivot positions.
 d. The equation Ax=0 has only the trivial solution.
 e. The columns of A form a linearly independent set.
 f. The linear transformation x ↦ Ax is one-to-one.
 g. The equation Ax=b has at least one solution for each b in R^{n}.
 h. The columns of A span R^{n}.
 i. The linear transformation x ↦ Ax maps R^{n} onto R^{n}.
 j. There is an n x n matrix C such that CA=I.
 k. There is an n x n matrix D such that AD=I.
 l. A^{T} is an invertible matrix.
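As a tiny sanity check of how statements (a) and (d) travel together, take a singular 2x2 matrix and exhibit a nontrivial solution of Ax = 0 (the matrix and vector are chosen for illustration):

```python
# This A is singular (its second row is twice the first), so statement (a)
# of the IMT is false for it; accordingly (d) must also be false, i.e.
# Ax = 0 has a nontrivial solution.
A = [[1, 2], [2, 4]]
x = [2, -1]                      # a nontrivial vector in Nul A
Ax = [A[0][0]*x[0] + A[0][1]*x[1],
      A[1][0]*x[0] + A[1][1]*x[1]]
# Ax == [0, 0] even though x is not the zero vector.
```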

A subspace of R^{n} is any set H such that (i) the zero vector is in H, (ii) u, v, and u+v are in H, and (iii) c is a scalar and cu is in H.
False. See the definition at the beginning of the section. The critical phrases “for each” are missing.

If v_{1},...,v_{p} are in R^{n}, then Span{v_{1},...,v_{p}} is the same as the column space of the matrix [v_{1} ... v_{p}].
True. See the paragraph before Example 4.

The set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of R^{m}.
False. See Theorem 12. The null space is a subspace of R^{n}, not R^{m}.

Row operations do not affect linear dependence relations among the columns of a matrix.
True. See the first part of the solution of Example 8.

A subset H of R^{n} is a subspace if the zero vector is in H.
False. See the definition at the beginning of the section. The condition about the zero vector is only one of the conditions for a subspace.

Given vectors v_{1},...,v_{p} in R^{n}, the set of all linear combinations of these vectors is a subspace of R^{n}.
True. See Example 3.

The null space of an m x n matrix is a subspace of R^{n}.
True. See Theorem 12.

The column space of a matrix A is the set of solutions of Ax=b.
False. See the paragraph after Example 4.

If B is an echelon form of a matrix A, then the pivot columns of B form a basis for Col A.
False. See the Warning that follows Theorem 13.

Suppose a 3x5 matrix A has three pivot columns. Is Col A=R^{3}? Is Nul A=R^{2}?
Col A = R^{3}, because A has a pivot in each row and so the columns of A span R^{3}. Nul A cannot equal R^{2}, because Nul A is a subspace of R^{5}. It is true, however, that Nul A is two-dimensional. Reason: the equation Ax = 0 has two free variables, because A has five columns and only three of them are pivot columns.
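The counting argument can be sketched in pure Python: row reduce with exact arithmetic, count pivots, and subtract from the number of columns (the 3x5 matrix below is a hypothetical example with three pivot columns):

```python
from fractions import Fraction

def pivot_count(M):
    """Number of pivot positions, found by row reduction to echelon form."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    r = 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if piv is None:
            continue                      # no pivot in this column
        A[r], A[piv] = A[piv], A[r]       # interchange pivot row into place
        for i in range(r + 1, rows):
            f = A[i][c] / A[r][c]
            A[i] = [A[i][j] - f * A[r][j] for j in range(cols)]
        r += 1
        if r == rows:
            break
    return r

# Hypothetical 3x5 matrix with three pivot columns:
A = [[1, 2, 0, 1, 0],
     [0, 0, 1, 2, 0],
     [1, 2, 1, 3, 1]]
rank = pivot_count(A)           # dim Col A
nullity = len(A[0]) - rank      # dim Nul A = number of free variables
```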

The columns of an invertible nxn matrix form a basis for R^{n}.
True. See Example 5.

An n x n determinant is defined by determinants of (n-1) x (n-1) submatrices.
True. See the paragraph preceding the definition of the determinant.
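The recursive definition can be sketched directly (pure Python, expanding along the first row; illustrative only):

```python
def det(M):
    """Determinant by cofactor expansion along the first row:
    det A = sum over j of (-1)^(1+j) * a_{1j} * det(A_{1j})."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in M[1:]]  # delete row 1, column j
        total += (-1) ** j * M[0][j] * det(minor)
    return total
```

Each n x n determinant calls n determinants of (n-1) x (n-1) submatrices, exactly as the card states.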

The (i,j)-cofactor of a matrix A is the matrix A_{ij} obtained by deleting from A its ith row and jth column.
False. See the definition of cofactor, which precedes Theorem 1.

The cofactor expansion of det A down a column is the negative of the cofactor expansion along a row.
False. See Theorem 1.

The determinant of a triangular matrix is the sum of the entries on the main diagonal.
False. See Theorem 2.

If B={v_{1},...,v_{p}} is a basis for a subspace H and if x=c_{1}v_{1}+...+c_{p}v_{p}, then c_{1},...,c_{p} are the coordinates of x relative to the basis B.
True. This is the definition of a B-coordinate vector.

Each line in R^{n} is a one-dimensional subspace of R^{n}.
False. Dimension is defined only for a subspace. A line must pass through the origin in R^{n} to be a subspace of R^{n}.

The dimension of Col A is the number of pivot columns of A.
True. The sentence before Example 1 concludes that the number of pivot columns of A is the rank of A, which is the dimension of Col A by definition.

The dimensions of Col A and Nul A add up to the number of columns of A.
True. This is equivalent to the Rank Theorem because rank A is the dimension of Col A.

If a set of p vectors spans a p-dimensional subspace H of R^{n}, then these vectors form a basis for H.
True, by the Basis Theorem. In this case, the spanning set is automatically a linearly independent set.

If B is a basis for a subspace H, then each vector in H can be written in only one way as a linear combination of the vectors in B.
True. This fact is justified in the second paragraph of this section.

If B={v_{1},...,v_{p}} is a basis for a subspace H of R^{n}, then the correspondence x ↦ [x]_{B} makes H look and act the same as R^{p}.
True. See the second paragraph after Fig. 1.

The dimension of Nul A is the number of variables in the equation Ax=0.
False. The dimension of Nul A is the number of free variables in the equation Ax = 0.

The dimension of the column space of A is rank A.
True, by the definition of rank.

If H is a p-dimensional subspace of R^{n}, then a linearly independent set of p vectors in H is a basis for H.
True, by the Basis Theorem. In this case, the linearly independent set is automatically a spanning set.

A row replacement operation does not affect the determinant of a matrix.
True. See Theorem 3.

The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (-1)^{r}, where r is the number of row interchanges made during row reduction from A to U.
True. See the paragraph following Example 2.
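A sketch of this computation in pure Python with exact arithmetic (illustrative, not the text's algorithm verbatim):

```python
from fractions import Fraction

def det_by_pivots(M):
    """det A = (-1)^r * (product of pivots in an echelon form U),
    where r counts the row interchanges used to reach U."""
    A = [[Fraction(x) for x in row] for row in M]
    n = len(A)
    swaps = 0
    for c in range(n):
        piv = next((i for i in range(c, n) if A[i][c] != 0), None)
        if piv is None:
            return Fraction(0)            # no pivot in this column: det A = 0
        if piv != c:
            A[c], A[piv] = A[piv], A[c]   # row interchange
            swaps += 1
        for i in range(c + 1, n):
            f = A[i][c] / A[c][c]
            A[i] = [A[i][j] - f * A[c][j] for j in range(n)]
    prod = Fraction(1)
    for i in range(n):
        prod *= A[i][i]
    return (-1) ** swaps * prod
```

For [[0, 2], [3, 4]], one interchange gives pivots 3 and 2, so the determinant is (-1)^1 * 6 = -6.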

If the columns of A are linearly dependent, then det A=0.
True. See the paragraph following Theorem 4.

det(A+B)=det A+det B
False. See the warning following Example 5.

If two row interchanges are made in succession, then the new determinant equals the old determinant.
True. See Theorem 3.

The determinant of A is the product of the diagonal entries in A.
False. This is true only if A is triangular. See Theorem 2.

If det A is zero, then two rows or two columns are the same, or a row or a column is zero.
False. See Example 3.

det A^{T} = (-1)det A
False. See Theorem 5.
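By Theorem 5, det A^{T} = det A, not (-1)det A. A direct 3x3 check (matrix chosen arbitrarily):

```python
def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def transpose(M):
    """Rows become columns."""
    return [list(col) for col in zip(*M)]

A = [[1, 2, 3], [0, 4, 5], [1, 0, 6]]
# det A^{T} equals det A (both are 22 here), not its negative.
```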

