3.3.6

By Proposition 1.2.18, a square matrix A is invertible if and only if the equation Ac = 0 admits only the trivial solution c = 0 (here c is an n-by-1 column vector). Thus, to show that A is invertible if and only if its columns are linearly independent, it suffices to show that the equation Ac = 0 admits only the trivial solution c = 0 if and only if the columns of A are linearly independent.

Suppose that the columns of A are linearly independent, and consider the equation Ac = 0. Write $$A = [A_1 \ A_2 \ \dots \ A_n]$$, where $$A_i$$ denotes column i of A. Then, letting $$c = (c_1, c_2, \dots, c_n)$$ be an n-by-1 column vector, we can write Ac = 0 as $$c_1A_1 + \dots + c_nA_n = 0$$. By assumption the columns $$A_i$$ are linearly independent, so $$c_i = 0$$ for every i. Hence c is the zero vector, and the equation Ac = 0 admits only the trivial solution c = 0.
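As a quick numerical illustration (not part of the proof), the identity Ac = c_1A_1 + ... + c_nA_n used above can be checked directly in NumPy; the 3-by-3 matrix and vector here are arbitrary examples chosen for the sketch.

```python
import numpy as np

# Arbitrary example matrix A and coefficient vector c.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
c = np.array([2.0, -1.0, 3.0])

# Matrix-vector product Ac.
Ac = A @ c

# Linear combination of the columns of A weighted by the entries of c:
# c_1*A_1 + c_2*A_2 + c_3*A_3, where A[:, i] is column i+1 of A.
combo = c[0] * A[:, 0] + c[1] * A[:, 1] + c[2] * A[:, 2]

print(np.allclose(Ac, combo))  # True: Ac is this column combination
```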

Conversely, suppose that the equation Ac = 0 admits only the trivial solution c = 0. Let $$c_1, \dots, c_n$$ be scalars such that $$c_1A_1 + \dots + c_nA_n = 0$$. Then the vector $$c = (c_1, \dots, c_n)$$ satisfies Ac = 0, so c = 0 and hence $$c_i = 0$$ for all i, showing that the columns of A are linearly independent. Therefore, the equation Ac = 0 admits only the trivial solution c = 0 if and only if the columns of A are linearly independent, and thus A is invertible if and only if its columns are linearly independent.
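The equivalence can also be observed numerically (again, only as a sanity check, not a proof): an invertible matrix has columns of full rank, while a matrix with linearly dependent columns admits a nontrivial solution of Ac = 0. The 2-by-2 matrices below are arbitrary examples.

```python
import numpy as np

# Invertible case: det(A) = 1 != 0, so the columns are independent
# and rank equals n = 2.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.matrix_rank(A))  # 2

# Singular case: the second column of B is twice the first, so
# c = (2, -1) is a nontrivial solution of Bc = 0.
B = np.array([[1.0, 2.0],
              [3.0, 6.0]])
c = np.array([2.0, -1.0])
print(np.allclose(B @ c, 0))     # True: nontrivial null vector
print(np.linalg.matrix_rank(B))  # 1 < 2: columns dependent, B singular
```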