MATH2121 Linear Algebra Quiz
Date: 4 April 2014 Time: 5:00 pm - 5:50 pm
Name:
Chan Oiloom Vernon
Student I.D.:
XXXXXXXX
• There are 4 problems in total.
• Show all the steps.
• You may write on both sides of the paper.
• No toilet break.
• You may leave early.
Problem No.   Marks   Out of
1                     50
2                     10
3                     15
4                     25
Total                 100
Problem 1. True or False? Put down T or F. (2 marks for each right answer, −2 marks for each wrong answer, and 0 marks for each blank.) V and W will denote vector spaces and T will always be a linear transformation. (1) The set of invertible n × n matrices is a subspace of M_{n×n}(R).
(F)
(2) The set of nilpotent n × n matrices is a subspace of Mn×n (R). (A is nilpotent means Ak = 0 for some positive integer k.)
(F)
(3) {(A, B) | AB = BA} is a subspace of Mn×n (R) × Mn×n (R).
(F)
(4) {(x1 , x2 , x3 , . . . ) | xi+1 ≥ xi for all i} is a subspace of R∞ .
(F)
(5) {p ∈ P(R) | deg p is even} is a subspace of P(R) (the set of polynomials).
(F)
(6) {f ∈ C(R) | f (x + π) = f (x) for all x} is infinite-dimensional.
(T)
(7) The kernel of d/dx : C^1(R) → C(R) is infinite-dimensional.
(F)
(8) The kernel of d/dx|_{x=0} : C^1(R) → R is infinite-dimensional.
(T)
(9) Let T : V → V . T is one-to-one if and only if T is onto.
(F)
(10) The empty set ∅ is a basis for the zero space {0}.
(T)
(11) {(2, −4, 1, 0)^T, (0, 3, −1, 0)^T, (6, 0, −1, 0)^T} can be expanded to a minimal spanning set of R^4.
(F)
(12) If {v_1, . . . , v_m} is a spanning set of V, then a basis can be constructed by discarding those v_i which are linear combinations of v_{i+1}, . . . , v_m.
(T)
(13) R3 is the direct sum of the x-axis and the solution set of x + y + z = 0.
(T)
(14) Mn×n (R) is a direct sum of the subspace of upper triangular matrices and the subspace of lower triangular matrices.
(F)
(15) Mn×n (R) is a direct sum of SMn×n (R) and ASMn×n (R).
(T)
(16) If A is a matrix and rank(A) = 0, then A = 0.
(T)
(17) If A, B and AB are symmetric matrices, then AB = BA.
(T)
(18) If A ∈ Mn×n (R) and Col(A) = Row(A), then A is symmetric.
(F)
(19) Every n × n matrix satisfies a polynomial of degree n^2 − 1.
(F)
(20) Every subspace of Rn is a solution set of some linear system.
(T)
(21) There exists T : R^2 → R^2 such that T^2 ≠ 0 but T^3 = 0.
(F)
(22) There exists T : R2 → R2 such that T 2 is the reflection about the x-axis.
(F)
(23) There exists T : R3 → R3 such that T 2 is the reflection about the xy-plane.
(F)
(24) If A is an n × n matrix, then dim(Nul A^n) = dim(Nul A^{n+1}) = dim(Nul A^{n+2}) = · · · .
(T)
(25) If A ∈ M_{m×n}(R), then rank(A^T A) = rank(A).
(T)
Reasons.

(1) The zero matrix is not invertible.

(2) Counter-example: In M_{2×2}(R),
\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}^2 = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}
\quad \text{and} \quad
\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}^2 = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},
so both matrices are nilpotent. However, their sum is invertible:
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}^2 = I_2.
So the set is not closed under vector addition.

(3) Counter-example: Notice that every matrix commutes with the zero matrix. Hence in M_{2×2}(R), the pairs
\left( \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, 0 \right)
\quad \text{and} \quad
\left( 0, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \right)
are in the set, but their sum
\left( \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \right)
is not, since
\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}
\neq
\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}.

(4) (1, 2, 3, 4, . . . ) lies in the set but −1 · (1, 2, 3, 4, . . . ) = (−1, −2, −3, −4, . . . ) does not, so the set is not closed under scalar multiplication.

(5) x^2 + x and −x^2 both have even degree, but (x^2 + x) + (−x^2) = x has degree 1, so the set is not closed under vector addition.

(6) It is isomorphic to the subspace of C[0, π] consisting of functions f satisfying f(0) = f(π), so it is infinite-dimensional.

(7) The kernel consists of the functions whose derivative is identically zero, i.e. the constant functions. The kernel is spanned by the function that is identically 1, so it is one-dimensional.

(8) {x^2, x^3, x^4, . . . } is a linearly independent sequence in the kernel, so the kernel is infinite-dimensional by Problem 8 in Tutorial Note 6.

(9) (The statement is true only when V is finite-dimensional.) Counter-example: multiplication by x is a one-to-one linear transformation from P(R) to P(R), but it is not onto.

(10) As mentioned in Tutorial 6, this is the exceptional case handled by a separate definition.

(11) The vectors are linearly dependent.

(12) This is just the spanning set theorem with the vectors put in another order.

(13) From the geometric viewpoint this is obvious: the x-axis and the plane x + y + z = 0 intersect only at the origin, and their dimensions add up to 1 + 2 = 3 = dim R^3, so R^3 is their direct sum.
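As a quick numerical sanity check of the counter-example in reason (2) (a sketch using NumPy, not part of the original solution):

```python
import numpy as np

# Two nilpotent 2x2 matrices whose sum is invertible.
N1 = np.array([[0, 1],
               [0, 0]])
N2 = np.array([[0, 0],
               [1, 0]])

assert (N1 @ N1 == 0).all()        # N1^2 = 0, so N1 is nilpotent
assert (N2 @ N2 == 0).all()        # N2^2 = 0, so N2 is nilpotent

S = N1 + N2
assert (S @ S == np.eye(2)).all()  # (N1 + N2)^2 = I2, so the sum is invertible
```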
(14) Recall that U is the direct sum of subspaces V and W if (i) V + W = U and (ii) V ∩ W = {0}. The intersection of the two subspaces in the problem is the set of diagonal matrices (in particular it contains the identity matrix I_n), so condition (ii) is violated.

(15) M_{n×n}(R) is the sum of SM_{n×n}(R) and ASM_{n×n}(R), as shown in one of the homework problems. In fact, every matrix A equals (1/2)(A + A^T) + (1/2)(A − A^T), where (1/2)(A + A^T) ∈ SM_{n×n}(R) and (1/2)(A − A^T) ∈ ASM_{n×n}(R). To see SM_{n×n}(R) ∩ ASM_{n×n}(R) = {0} (0 being the zero matrix), suppose a matrix A ∈ M_{n×n}(R) is both symmetric and anti-symmetric. Then A = A^T = −A, so A = 0.

(16) Suppose on the contrary that A ≠ 0, i.e. some entry of A is nonzero. If a is a column vector of A that contains a nonzero entry, then a ≠ 0 and the column space of A contains span(a) ≠ {0}, so Col A is at least one-dimensional; hence rank(A) ≥ 1, a contradiction.

(17) AB = (AB)^T = B^T A^T = BA.

(18) Counter-example: the permutation matrix
\begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix}
is invertible, so its column space and row space are both R^3, but it is not symmetric.
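The decomposition in reason (15) can be verified numerically (a sketch using NumPy; the random 4 × 4 matrix is an arbitrary choice of example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# A = 1/2 (A + A^T) + 1/2 (A - A^T): symmetric part plus anti-symmetric part.
S = (A + A.T) / 2   # symmetric part
K = (A - A.T) / 2   # anti-symmetric part

assert np.allclose(S, S.T)     # S is symmetric
assert np.allclose(K, -K.T)    # K is anti-symmetric
assert np.allclose(S + K, A)   # the two parts sum back to A
```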
(19) Counter-example: The 1 × 1 matrix [1] does not satisfy a polynomial of degree 0 (= n^2 − 1 for n = 1). See also Problem 3 in Tutorial Note 6.

(20) Let V be a subspace of R^n and B = (v_1, . . . , v_m) be a basis of V. Extend B to a basis C = (v_1, . . . , v_m, w_1, . . . , w_{n−m}) of R^n. Define the linear transformation T : R^n → R^{n−m} by T(v_i) = 0 and T(w_j) = e_j, and let A be the matrix corresponding to T (with respect to the standard bases of R^n and R^{n−m}). Then V is the solution set of the linear system Ax = 0. (There is more than one possibility.)

(21) It was proved in the lecture that if A ∈ M_{n×n}(R), then rank(A^n) = rank(A^{n+1}) = rank(A^{n+2}) = · · · . Suppose on the contrary that such T exists, and let A be the matrix corresponding to T (with respect to the standard basis), which is 2 × 2. Since T^3 = 0, we have rank(A^2) = rank(A^3) = 0, so T^2 = 0, a contradiction.

(22) Suppose on the contrary that such T exists. Let A be the matrix corresponding to T (with respect to the standard basis). The matrix corresponding to the reflection about the x-axis (with respect to the standard basis) is
B = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.
Then A^2 = B. However, taking determinants on both sides gives 0 ≤ (det A)^2 = det(A^2) = det(B) = −1, a contradiction.

(23) The matrix corresponding to the reflection about the xy-plane (with respect to the standard basis) is
B' = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{pmatrix}.
Repeat the argument of the last problem with B replaced by B' to get a contradiction.

(24) Use again the result that if A ∈ M_{n×n}(R), then rank(A^n) = rank(A^{n+1}) = rank(A^{n+2}) = · · · . By the Rank–Nullity Theorem, dim(Nul A^i) = n − rank(A^i), so dim(Nul A^n) = dim(Nul A^{n+1}) = dim(Nul A^{n+2}) = · · · .
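The rank-stabilization fact used in reasons (21) and (24) can be illustrated numerically (a sketch using NumPy; the 3 × 3 nilpotent Jordan block is my own choice of example):

```python
import numpy as np

# For an n x n matrix A, rank(A^n) = rank(A^(n+1)) = ...
# A nilpotent Jordan block loses rank with each power until it hits 0,
# after which the rank can never change again.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

ranks = [np.linalg.matrix_rank(np.linalg.matrix_power(A, k)) for k in range(1, 6)]
# ranks == [2, 1, 0, 0, 0]: stabilized from the n-th power (n = 3) onward
assert ranks == [2, 1, 0, 0, 0]
```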
(25) We show that ker(A^T A) = ker(A). First notice that if x = (x_1, . . . , x_n)^T ∈ R^n and x^T x = 0, then x_1^2 + · · · + x_n^2 = 0, hence x_1 = · · · = x_n = 0, i.e. x = 0. Clearly ker(A) ⊆ ker(A^T A). To show ker(A^T A) ⊆ ker(A), suppose x ∈ ker(A^T A). Then A^T Ax = 0, so (Ax)^T (Ax) = x^T A^T Ax = 0, which forces Ax = 0 by the observation above; hence x ∈ ker(A). Now by the Rank–Nullity Theorem,
rank(A^T A) = n − dim(Nul A^T A) = n − dim(Nul A) = rank(A).
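Reason (25) can be checked numerically as well (a sketch using NumPy; the rank-deficient 4 × 3 matrix is my own choice of example):

```python
import numpy as np

# rank(A^T A) = rank(A), here on a 4x3 matrix of rank 2:
# the third column equals 2*(second column) - (first column).
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.],
              [1., 1., 1.]])

assert np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(A.T @ A) == np.linalg.matrix_rank(A)
```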
Problem 2. Let V and W be vector spaces and f : V → W be a function. Show that conditions (1) and (2) are equivalent:

(1) (i) f(v_1 + v_2) = f(v_1) + f(v_2) and (ii) f(cv) = cf(v), for all v_1, v_2 ∈ V and c ∈ R.

(2) f(av_1 + bv_2) = af(v_1) + bf(v_2), for all v_1, v_2 ∈ V and a, b ∈ R.

Justify each step by writing down
• "axiom" if you used an axiom of vector space, or
• "property" if you used a derived property of vector space, or
• "(1)(i)", "(1)(ii)" or "(2)" if you used an assumption above.
Do not use more than one justification in one step. [10]
Solution. To show (1) and (2) are equivalent, we have to prove two directions, (1) ⟹ (2) and (2) ⟹ (1).

(1) ⟹ (2):
f(av_1 + bv_2) = f(av_1) + f(bv_2)        by (1)(i)
               = af(v_1) + bf(v_2)        by (1)(ii)

(2) ⟹ (1)(i):
f(v_1 + v_2) = f(1 · v_1 + 1 · v_2)       by axiom 10
             = 1 · f(v_1) + 1 · f(v_2)    by (2)
             = f(v_1) + f(v_2)            by axiom 10

(2) ⟹ (1)(ii):
f(cv) = f(cv + 0)                         by axiom 4
      = f(cv + 0 · 0)                     by the property 0 · v = 0
      = cf(v) + 0 · f(0)                  by (2)
      = cf(v) + 0                         by the property 0 · v = 0
      = cf(v)                             by axiom 4
Problem 3. Suppose V and W are subspaces of a vector space U such that V ∪ W is also a subspace. Prove that V ⊆ W or W ⊆ V. [15]

Solution. Suppose on the contrary that V ⊈ W and W ⊈ V. Then there exist v ∈ V and w ∈ W such that v ∉ W and w ∉ V. Since v, w ∈ V ∪ W and V ∪ W is a subspace, v + w ∈ V ∪ W, which means v + w ∈ V or v + w ∈ W. If v + w ∈ V, then w = (v + w) − v ∈ V, a contradiction. If v + w ∈ W, then v = (v + w) − w ∈ W, a contradiction.
Problem 4. Let f_1, f_2, . . . , f_n : R → R be (n − 1)-times differentiable functions.

(a) Prove that if
\begin{vmatrix}
f_1(x_0) & f_2(x_0) & \cdots & f_n(x_0) \\
f_1'(x_0) & f_2'(x_0) & \cdots & f_n'(x_0) \\
\vdots & \vdots & \ddots & \vdots \\
f_1^{(n-1)}(x_0) & f_2^{(n-1)}(x_0) & \cdots & f_n^{(n-1)}(x_0)
\end{vmatrix} \neq 0
for some x_0 ∈ R, then {f_1, f_2, . . . , f_n} is linearly independent in C^1(R). [10]

(b) Use this result to show that {x, xe^x, x^2 e^x} is linearly independent. [10]

(c) Calculate the determinant above for {x^2, x|x|}. Is it linearly independent in C^1(R)? In C^1(a, b) for some a < b? (Drawing the graphs will help you.) [5]
Solution. (a) Suppose c_1 f_1 + · · · + c_n f_n = 0 (the zero function). Differentiating both sides 0, 1, . . . , n − 1 times, we get
c_1 f_1 + · · · + c_n f_n = 0
c_1 f_1' + · · · + c_n f_n' = 0
⋮
c_1 f_1^{(n-1)} + · · · + c_n f_n^{(n-1)} = 0.
Plugging in x_0,
\begin{pmatrix}
f_1(x_0) & f_2(x_0) & \cdots & f_n(x_0) \\
f_1'(x_0) & f_2'(x_0) & \cdots & f_n'(x_0) \\
\vdots & \vdots & \ddots & \vdots \\
f_1^{(n-1)}(x_0) & f_2^{(n-1)}(x_0) & \cdots & f_n^{(n-1)}(x_0)
\end{pmatrix}
\begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix}
= \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}.
Since the determinant of the coefficient matrix is nonzero, the matrix is invertible, so the system has only the trivial solution c_1 = · · · = c_n = 0.

(b) The determinant is
\begin{vmatrix}
x & xe^x & x^2 e^x \\
1 & e^x + xe^x & 2xe^x + x^2 e^x \\
0 & 2e^x + xe^x & 2e^x + 4xe^x + x^2 e^x
\end{vmatrix}.
Plugging in x = 1, the determinant becomes
\begin{vmatrix} 1 & e & e \\ 1 & 2e & 3e \\ 0 & 3e & 7e \end{vmatrix}
= \begin{vmatrix} 1 & e & e \\ 0 & e & 2e \\ 0 & 3e & 7e \end{vmatrix}
= 7e^2 - 6e^2 = e^2 \neq 0
(subtract the first row from the second, then expand along the first column). By part (a), {x, xe^x, x^2 e^x} is linearly independent.

(c) Note that (x|x|)' = 2|x| (x|x| is also differentiable at x = 0, with derivative 0). The determinant is
\begin{vmatrix} x^2 & x|x| \\ 2x & 2|x| \end{vmatrix} = 2x^2|x| - 2x^2|x| = 0
for all x ∈ R. Nevertheless, {x^2, x|x|} is linearly independent in C^1(R), since the two functions are not scalar multiples of each other. Note that
x|x| = x^2 if x ≥ 0, and x|x| = −x^2 if x < 0.
(Try drawing the graphs of both functions.) {x^2, x|x|} is linearly independent in C^1(a, b) as long as the interval (a, b) contains 0, i.e. a < 0 < b, since the functions are not scalar multiples of each other in that case either. {x^2, x|x|} is linearly dependent in C^1(a, b) if a < b ≤ 0 or 0 ≤ a < b; in these cases x^2 and x|x| are the same function or negatives of each other on (a, b).
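The determinant computations in parts (b) and (c) can be double-checked numerically (a sketch using NumPy; the sample points in part (c) are my own choice):

```python
import math
import numpy as np

# Part (b): the Wronskian of {x, x e^x, x^2 e^x} evaluated at x = 1.
e = math.e
W = np.array([[1.0,   e,   e],    # f(1)  for each function
              [1.0, 2*e, 3*e],    # f'(1)
              [0.0, 3*e, 7*e]])   # f''(1)
det = np.linalg.det(W)
assert math.isclose(det, e**2, rel_tol=1e-9)  # equals e^2, as computed above

# Part (c): the Wronskian of {x^2, x|x|} vanishes at every sample point.
for x in (-2.0, -0.5, 0.0, 1.5):
    W2 = np.array([[x**2,     x*abs(x)],
                   [2*x,      2*abs(x)]])   # (x|x|)' = 2|x|
    assert abs(np.linalg.det(W2)) < 1e-12
```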
The End.