• if you use an online LaTeX editor: as nobody can predict whether the server of the online editor will be up when you really need it (for instance just before the deadline), I strongly advise you to keep a local copy of your .tex files.
• in any case, you will have to submit a .pdf file, not the .tex file.
Problem 1 – 4 points
Let $F : \mathbb{R}^3 \to \mathbb{R}^2$ be defined by
$$F(\vec{u}) = \begin{pmatrix} 1 & 2 & 3 \\ -1 & -2 & -1 \end{pmatrix} \vec{u}.$$
Question 1: Give a basis for Im(F).
ANSWER: Let
$$\vec{u}_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \quad \vec{u}_2 = \begin{pmatrix} 2 \\ -2 \end{pmatrix}, \quad \vec{u}_3 = \begin{pmatrix} 3 \\ -1 \end{pmatrix}.$$
The family $\langle \vec{u}_1, \vec{u}_2, \vec{u}_3 \rangle$ spans Im(F). Since $\vec{u}_2 = 2\vec{u}_1$, the family $\langle \vec{u}_1, \vec{u}_3 \rangle$ also spans Im(F). We finally show that the latter family is linearly independent, i.e. that
$$a\vec{u}_1 + b\vec{u}_3 = \vec{0} \;\Rightarrow\; a = b = 0.$$
Suppose $a, b \in \mathbb{R}$ are such that $a\vec{u}_1 + b\vec{u}_3 = \vec{0}$. Then
$$\begin{cases} a + 3b = 0 \\ -a - b = 0 \end{cases}
\quad\text{which rewrites as}\quad
\begin{cases} 2b = 0 \\ a = -b \end{cases}$$
hence $a = b = 0$.
The family $\langle \vec{u}_1, \vec{u}_3 \rangle$ spans Im(F) and is linearly independent: it is a basis of Im(F).
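As an optional numerical sanity check (a sketch using Python/numpy, not part of the expected answer), one can verify that $\langle \vec{u}_1, \vec{u}_3 \rangle$ indeed has rank 2 and therefore spans $\mathbb{R}^2 = $ Im(F):

```python
import numpy as np

# Columns of the matrix of F.
u1 = np.array([1, -1])
u2 = np.array([2, -2])
u3 = np.array([3, -1])

# u2 = 2 u1, so it adds nothing to the span.
assert np.array_equal(u2, 2 * u1)

# <u1, u3> has rank 2: it is linearly independent and spans R^2 = Im(F).
assert np.linalg.matrix_rank(np.column_stack([u1, u3])) == 2
```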
Question 2: Give a basis for Null(F).
ANSWER: Notice first that
$$\begin{pmatrix} 1 & 2 & 3 \\ -1 & -2 & -1 \end{pmatrix} \begin{pmatrix} 2 \\ -1 \\ 0 \end{pmatrix} = \vec{0},$$
which means that $\vec{v} = (2, -1, 0)^T$ is in Null(F).
Now, since Rank(F) = 2 (Question 1), the rank-nullity theorem gives Dim(Null(F)) = 3 − 2 = 1, thus the family $\langle \vec{v} \rangle$ is a basis for Null(F).
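Again as an optional numpy check (a sketch), one can verify that $M\vec{v} = \vec{0}$ and that the rank is indeed 2:

```python
import numpy as np

M = np.array([[1, 2, 3],
              [-1, -2, -1]])
v = np.array([2, -1, 0])

# v is in Null(F): M v = 0.
assert np.array_equal(M @ v, np.zeros(2))

# Rank-nullity: dim Null(F) = 3 - rank(M) = 3 - 2 = 1, so <v> is a basis.
assert np.linalg.matrix_rank(M) == 2
```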
Problem 2 – 4 points
A square n × n matrix M is non-singular if Rank(M) = n.
Question 1: Let A be an n × n matrix.
Show that A is non-singular if and only if $A\vec{x} = \vec{0} \Rightarrow \vec{x} = \vec{0}$.
ANSWER: Recall that A is associated with a linear map $F : \mathbb{R}^n \to \mathbb{R}^n$.
Suppose that A is non-singular: Rank(A) = n. From the rank-nullity theorem, Dim(Null(A)) = 0, so Null(A) = $\{\vec{0}\}$, i.e. $A\vec{x} = \vec{0} \Rightarrow \vec{x} = \vec{0}$.
Conversely, suppose that $A\vec{x} = \vec{0} \Rightarrow \vec{x} = \vec{0}$. Then $\vec{0}$ is the only vector in Null(A), so Dim(Null(A)) = 0. From the rank-nullity theorem, Rank(A) = n − 0 = n.
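As an illustration (a numpy sketch with matrices chosen only as examples), a full-rank matrix has only $\vec{0}$ in its null space, while a rank-deficient one does not:

```python
import numpy as np

# A non-singular 2x2 matrix: rank 2, so A x = 0 forces x = 0.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
assert np.linalg.matrix_rank(A) == 2

# A singular matrix: rank 1, and (2, -1) is a nonzero vector in its null space.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
assert np.linalg.matrix_rank(S) == 1
assert np.allclose(S @ np.array([2.0, -1.0]), 0.0)
```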
Question 2: Let A and B be two n × n non-singular matrices.
Prove that AB is non-singular.
ANSWER: We show that $AB\vec{x} = \vec{0} \Rightarrow \vec{x} = \vec{0}$; by Question 1 this implies that AB is non-singular.
Let $\vec{x}$ be such that $AB\vec{x} = \vec{0}$, i.e. $A(B\vec{x}) = \vec{0}$.
Then $B\vec{x} = \vec{0}$ by Question 1 since A is non-singular,
and $\vec{x} = \vec{0}$ by Question 1 since B is non-singular.
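A quick numerical illustration (a numpy sketch with example matrices): the product of two full-rank matrices again has full rank:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
B = np.array([[1.0, 3.0],
              [1.0, 4.0]])

# A and B are non-singular ...
assert np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(B) == 2
# ... and so is their product AB.
assert np.linalg.matrix_rank(A @ B) == 2
```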
Problem 3 – 4 points
Let $F : \mathbb{R}^n \to \mathbb{R}^n$ be the linear map with n × n matrix M of rank n (i.e. M is non-singular). In this problem, we will prove that there exists a matrix N such that $MN = NM = I_n$.
Question 1: Let $\vec{b} \in \mathbb{R}^n$. Show that there is a unique $\vec{x} \in \mathbb{R}^n$ so that $M\vec{x} = \vec{b}$.
ANSWER: Since Rank(M) = n, Im(F) = $\mathbb{R}^n$, so for any $\vec{b} \in \mathbb{R}^n$ there is an $\vec{x} \in \mathbb{R}^n$ such that $M\vec{x} = \vec{b}$.
Suppose $\vec{x}$ is not unique: there exists $\vec{y} \in \mathbb{R}^n$, $\vec{y} \neq \vec{x}$, such that $M\vec{y} = \vec{b}$. Then $M\vec{y} - M\vec{x} = \vec{0}$, so $\vec{y} - \vec{x} \neq \vec{0}$ is in Null(F) and Dim(Null(F)) ≥ 1, which contradicts the rank-nullity theorem (Rank(M) = n forces Dim(Null(F)) = n − n = 0).
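Numerically, this unique solution is what np.linalg.solve computes (a sketch; the random matrix is just an example and is non-singular with probability 1, which the assert double-checks):

```python
import numpy as np

rng = np.random.default_rng(0)

M = rng.standard_normal((4, 4))
assert np.linalg.matrix_rank(M) == 4  # M is non-singular

b = rng.standard_normal(4)
x = np.linalg.solve(M, b)   # the unique solution of M x = b
assert np.allclose(M @ x, b)
```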
Question 2: Let $G : \mathbb{R}^n \to \mathbb{R}^n$ be the map defined as
$$G(\vec{b}) = \vec{x} \quad \text{where } \vec{x} \text{ is such that } M\vec{x} = \vec{b}$$
(G is well defined because of the existence and uniqueness of $\vec{x}$ for a given $\vec{b}$).
Show that G is a linear map.
ANSWER: Let $\vec{b}, \vec{c}$ be two vectors in $\mathbb{R}^n$, and $s \in \mathbb{R}$.
Let $\vec{x} = G(\vec{b})$ and $\vec{y} = G(\vec{c})$. One has $M\vec{x} + M\vec{y} = \vec{b} + \vec{c}$, thus $M(\vec{x} + \vec{y}) = \vec{b} + \vec{c}$, and $G(\vec{b} + \vec{c}) = \vec{x} + \vec{y} = G(\vec{b}) + G(\vec{c})$.
Now, $sG(\vec{b}) = s\vec{x}$, and one has $M(s\vec{x}) = sM\vec{x} = s\vec{b}$, thus $G(s\vec{b}) = s\vec{x} = sG(\vec{b})$, and G is linear.
Question 3: Show that for any $\vec{x} \in \mathbb{R}^n$, $G(F(\vec{x})) = F(G(\vec{x})) = \vec{x}$.
ANSWER: We first show that for any $\vec{x} \in \mathbb{R}^n$, $G(F(\vec{x})) = \vec{x}$. Let $\vec{b} = F(\vec{x}) = M\vec{x}$. Then $G(\vec{b})$ is the unique vector $\vec{x}$ in $\mathbb{R}^n$ such that $M\vec{x} = \vec{b}$, thus $G(F(\vec{x})) = \vec{x}$.
We now show that for any $\vec{x} \in \mathbb{R}^n$, $F(G(\vec{x})) = \vec{x}$. Let $\vec{b} = G(\vec{x})$. By definition of G, one has $M\vec{b} = \vec{x}$. As a consequence, $F(G(\vec{x})) = M\vec{b} = \vec{x}$.
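A numerical illustration of both identities (a numpy sketch with an example matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

F = lambda v: M @ v                   # F(x) = M x
G = lambda v: np.linalg.solve(M, v)   # G(b) = the x with M x = b

assert np.allclose(G(F(x)), x)
assert np.allclose(F(G(x)), x)
```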
Question 4: Show that there exists a unique n × n matrix N so that $MN = NM = I_n$.
ANSWER: G is linear (Question 2), so it is associated with a unique n × n matrix N.
From the previous question, one has $\forall \vec{x} \in \mathbb{R}^n$, $MN\vec{x} = NM\vec{x} = \vec{x}$, which implies
$$MN = NM = I_n.$$
Moreover, if $N'$ also satisfies $MN' = N'M = I_n$, then $N' = (NM)N' = N(MN') = N$, so N is unique.
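Concretely, N is the inverse matrix of M, which numpy computes as np.linalg.inv (a sketch with an example matrix):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))

N = np.linalg.inv(M)   # the matrix of G
I = np.eye(4)

assert np.allclose(M @ N, I)
assert np.allclose(N @ M, I)
```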
Problem 4 – 4 points
Let M be an n × n matrix. We say that M is an orthogonal matrix if its columns have norm 1 and are pairwise orthogonal, i.e. $\forall i,\ M[:, i] \cdot M[:, i] = 1$ and $\forall i \neq j,\ M[:, i] \cdot M[:, j] = 0$.
Question 1: Show that if M is orthogonal, then $M^T M = I_n$.
ANSWER: Let $A = M^T M$, and $1 \leq i, j \leq n$.
One has $A[i, j] = M^T[i, :] \cdot M[:, j] = M[:, i] \cdot M[:, j]$. Since the columns of M form an orthonormal family, $M[:, i] \cdot M[:, j] = 1$ if $i = j$ and $M[:, i] \cdot M[:, j] = 0$ if $i \neq j$.
Thus $A[i, j] = 1$ if $i = j$ and $A[i, j] = 0$ if $i \neq j$, i.e. $A = I_n$.
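For instance, a 2 × 2 rotation matrix is orthogonal, and one can check $M^T M = I_2$ numerically (a numpy sketch; the angle is arbitrary):

```python
import numpy as np

t = 0.7  # an arbitrary angle
M = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])   # rotation matrices are orthogonal

# Columns have norm 1 and are pairwise orthogonal ...
assert np.isclose(M[:, 0] @ M[:, 0], 1.0)
assert np.isclose(M[:, 1] @ M[:, 1], 1.0)
assert np.isclose(M[:, 0] @ M[:, 1], 0.0)
# ... and indeed M^T M = I_2.
assert np.allclose(M.T @ M, np.eye(2))
```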
Question 2: Show that if M is orthogonal, then $MM^T = I_n$.
Hint: in order to prove that $MM^T = I_n$, you can prove that $\forall \vec{x} \in \mathbb{R}^n$, one has $MM^T\vec{x} = \vec{x}$.
ANSWER: We show that $\forall \vec{x} \in \mathbb{R}^n$, $MM^T\vec{x} = \vec{x}$.
Let $\vec{x} \in \mathbb{R}^n$. Since M is orthogonal, its columns are orthonormal, hence linearly independent, so M has full rank and there exists $\vec{y} \in \mathbb{R}^n$ such that $\vec{x} = M\vec{y}$.
Then $MM^T\vec{x} = MM^TM\vec{y} = M(M^TM)\vec{y} = M\vec{y} = \vec{x}$ (using Question 1). Since this holds for every $\vec{x}$, $MM^T = I_n$.
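The same rotation-matrix example also satisfies $M M^T = I_2$ (a numpy sketch):

```python
import numpy as np

t = 0.7
M = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# M has full rank, and M M^T = I_2 as well.
assert np.linalg.matrix_rank(M) == 2
assert np.allclose(M @ M.T, np.eye(2))
```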
Problem 5 – 4 points
Let A be an n × n upper triangular matrix. In this exercise we will prove that A is non-singular
if and only if it has no 0 element on its diagonal.
Question 1: Suppose that A has at least one 0 diagonal element, and show that A is singular.
ANSWER: Let i be the smallest index such that A[i, i] = 0.
If i = 1, then the first column of A is $\vec{0}$ (A is upper triangular), so the family formed by the columns of A is not linearly independent and A is singular.
Now let i > 1. Let
$$\vec{u}_1 = A[1 : i-1,\ 1], \quad \vec{u}_2 = A[1 : i-1,\ 2], \quad \ldots, \quad \vec{u}_i = A[1 : i-1,\ i],$$
i.e. for $1 \leq j \leq i$, $\vec{u}_j \in \mathbb{R}^{i-1}$ is equal to the first i − 1 coordinates of the j-th column of A. The family $\langle \vec{u}_1, \ldots, \vec{u}_i \rangle$ contains i vectors of dimension i − 1: it is necessarily linearly dependent. Moreover, since A is upper triangular and A[i, i] = 0, the coordinates of A[:, j] in rows i, . . . , n are zero for every $1 \leq j \leq i$, so any nontrivial linear relation among the $\vec{u}_j$ is also a nontrivial linear relation among the full columns A[:, 1], . . . , A[:, i]. Hence the family $\langle A[:, 1], \ldots, A[:, n] \rangle$ is linearly dependent and A is singular.
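A quick numerical illustration (a numpy sketch with an example matrix): an upper triangular matrix with a zero diagonal entry has rank strictly less than n:

```python
import numpy as np

# Upper triangular, with a zero on the diagonal (second diagonal entry).
A = np.array([[1.0, 5.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 4.0]])

# The rank is strictly less than 3, so A is singular.
assert np.linalg.matrix_rank(A) < 3
```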
Question 2: Suppose that A has no 0 diagonal element, and show that A is non-singular.
ANSWER: We first prove, by induction on i, that for any i, $\vec{e}_i$ is in the span of $\langle A[:, 1], \ldots, A[:, i] \rangle$.
Notice first that $\vec{e}_1 = \frac{1}{A[1, 1]} A[:, 1]$, thus $\vec{e}_1$ is in the span of $\langle A[:, 1] \rangle$. Suppose now that $\forall j < i$, $\vec{e}_j$ is in the span of $\langle A[:, 1], \ldots, A[:, j] \rangle$. Since A is upper triangular, $A[:, i] = A[1, i]\,\vec{e}_1 + \ldots + A[i-1, i]\,\vec{e}_{i-1} + A[i, i]\,\vec{e}_i$, so
$$\vec{e}_i = \frac{1}{A[i, i]} \Bigl( A[:, i] - A[1, i]\,\vec{e}_1 - \ldots - A[i-1, i]\,\vec{e}_{i-1} \Bigr),$$
and $\vec{e}_i$ is in the span of $\langle A[:, 1], \ldots, A[:, i] \rangle$. This holds for any $i \leq n$.
Thus $\langle A[:, 1], \ldots, A[:, n] \rangle$ spans $\mathbb{R}^n$ and has n vectors; it is a basis of $\mathbb{R}^n$, so the family $\langle A[:, 1], \ldots, A[:, n] \rangle$ is linearly independent, i.e. A is non-singular.
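And with no zero on the diagonal, the rank is n (a numpy sketch with an example matrix):

```python
import numpy as np

# Upper triangular with no zero on the diagonal: full rank, hence non-singular.
A = np.array([[1.0, 5.0, 2.0],
              [0.0, 2.0, 3.0],
              [0.0, 0.0, 4.0]])
assert np.linalg.matrix_rank(A) == 3
```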