Does a matrix need to be square ($3\times 3$, $4\times 4$, etc.) for its columns to be linearly independent? Or do the columns just have to be $\geq$ the rows?
First, you can refer to the rows or columns of a matrix as being "linearly independent," but not really the matrix itself.
Now if the rows and columns are linearly independent, then your matrix is non-singular (i.e. invertible). Conversely, if your matrix is non-singular, its rows (and columns) are linearly independent.
Matrices only have inverses when they are square. This is related to the fact you hint at in your question.
If you have more rows than columns, your rows must be linearly dependent. Likewise, if you have more columns than rows, your columns must be linearly dependent. This means that if you want both your rows and your columns to be linearly independent, there must be an equal number of rows and columns (i.e. a square matrix).
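A quick numerical sketch of the point above, using NumPy's `matrix_rank` (the matrices here are my own made-up examples, not from the answer):

```python
import numpy as np

# A 3x3 matrix with linearly independent columns: its rank equals both
# the number of rows and the number of columns, so it is invertible.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
print(np.linalg.matrix_rank(A) == 3)   # True: columns (and rows) independent
print(np.linalg.det(A) != 0)           # True: hence invertible

# A 2x3 matrix: more columns than rows, so its 3 columns
# (vectors in R^2) must be linearly dependent.
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
print(np.linalg.matrix_rank(B) < B.shape[1])  # True: columns dependent
```

The rank can never exceed the smaller of the two dimensions, which is exactly why a non-square matrix must have dependent rows or dependent columns.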
Linear (in)dependence is a property of a set of vectors in some given vector space, and so one cannot speak of the linear (in)dependence of a matrix.
On the other hand, one often forms a matrix by adjoining (shunting together) several column vectors, and conversely given a matrix we regard each of its columns as a column vector, and so we can ask about the linear independence of these vectors so produced:
Given an $m \times n$ matrix $A$ (say, over the field $\Bbb F$), we get a set of $n$ vectors of size $m \times 1$, that is, in $\Bbb F^m$. The maximum size of any linearly independent set of vectors in $\Bbb F^m$ is $\dim_{\Bbb F} (\Bbb F^m) = m$, so if the columns of $A$ are linearly independent, we must have $n \leq m$, that is, at least as many rows as columns. Of course, the converse is false, as the example $A = 0$ shows. On the other hand, the example $A = I_n$ shows that this bound is sharp, that is, the columns of a square matrix can be linearly independent (in fact, this is generically true).
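The two examples mentioned ($A = 0$ and $A = I_n$) can be checked numerically; this is just a sketch using NumPy's rank computation:

```python
import numpy as np

m, n = 4, 3

# n <= m is necessary but not sufficient: the zero matrix satisfies
# n <= m, yet its columns are trivially linearly dependent.
Z = np.zeros((m, n))
print(np.linalg.matrix_rank(Z))   # 0, which is < n: columns dependent

# The identity matrix shows the bound n <= m is sharp: a square
# matrix can have fully independent columns.
I = np.eye(n)
print(np.linalg.matrix_rank(I))   # n: columns independent
```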
You can think of a matrix as a collection of column vectors, and of each of these column vectors as "unlocking a new dimension in space."
Consider the matrix
$$\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
The first column unlocks the x component. The second column unlocks the y component. The third column unlocks the z component.
Consider a matrix to be $m \times n$, where $m$ is the number of rows and $n$ is the number of columns. In our $3\times 3$ example, you have 3 vectors in 3D space. If you added another column to the matrix, you would be adding another column vector. This new vector wouldn't unlock any new dimensions, since you are restricted by the number of rows. On the other hand, consider removing a column: neither the first column nor the second column is redundant, because each contributes to unlocking a new dimension.
In conclusion, if $m \lt n$, the columns are guaranteed to be linearly dependent. If $m \geq n$, they may or may not be linearly independent, depending on the vectors themselves.
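A small sketch of the $m \lt n$ case, exhibiting an explicit dependence (the matrix and witness vector are made-up examples for illustration):

```python
import numpy as np

# m < n: 2 rows, 3 columns, so these three vectors in R^2
# cannot be linearly independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# A nonzero vector c with A @ c = 0 witnesses the dependence:
# here, column 3 = column 1 + column 2.
c = np.array([1.0, 1.0, -1.0])
print(np.allclose(A @ c, 0))   # True: a nontrivial combination gives 0
```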