If we're given a set of two vectors from $\mathbb{R}^4$, for example: $$S=\{(3,-1,1,1),(1,3,-1,1)\}$$ and we want to check whether they're linearly dependent or independent, is the following procedure a correct way to do this?
We form a matrix whose rows are those two vectors: $$\begin{bmatrix} 3 & -1 & 1 & 1 \\ 1 & 3 & -1 & 1 \end{bmatrix}$$ Then we transform that matrix to row echelon form by the row operation $II-\frac{1}{3}\cdot I$: $$\begin{bmatrix} 3 & -1 & 1 & 1 \\ 0 & \frac{10}{3} & -\frac{4}{3} & \frac{2}{3} \end{bmatrix}$$ We can see that the rank of this matrix is $2$, so it has two linearly independent rows. Does that answer the question of the linear independence of those two vectors?

The reason I'm asking is that my professor checked their linear independence by forming the matrix whose columns are the two vectors: $$\begin{bmatrix} 3 & 1 \\ -1 & 3 \\ 1 & -1 \\ 1 & 1 \end{bmatrix}$$ He then reduced it to row echelon form: $$\begin{bmatrix} 3 & 1 \\ 0 & \frac{10}{3} \\ 0 & 0 \\ 0 & 0 \end{bmatrix}$$ which corresponds to the system $$3x+y=0$$ $$\frac{10}{3}y=0,$$ so $x=0$ and $y=0$, which means the two vectors are independent.
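As a quick sanity check of both computations (a sketch using NumPy's `matrix_rank`, which is not part of either written procedure), the $2\times 4$ matrix and its transpose have the same rank:

```python
import numpy as np

# Rows are the two given vectors from S.
A = np.array([[3, -1, 1, 1],
              [1, 3, -1, 1]])

rank_rows = np.linalg.matrix_rank(A)    # vectors as rows (the questioner's setup)
rank_cols = np.linalg.matrix_rank(A.T)  # vectors as columns (the professor's setup)

print(rank_rows, rank_cols)  # both are 2, so the vectors are independent
```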
13 Answers
Both methods are correct. To check whether two vectors $a$ and $b$ are linearly independent, just check whether $b$ is a scalar multiple of $a$. Here the fourth components of the two vectors are equal (both are $1$), so if one vector were a scalar multiple of the other, the scalar would have to be $1$, forcing all components to be equal. They are not, so the two given vectors are linearly independent.
Alternatively, we can form a $2 \times 4$ matrix whose rows are $a$ and $b$. If the rank of this matrix is $2$, then the two vectors are linearly independent. The rank can be obtained by doing row operations, as you did, but it can also be obtained by doing column operations (which is equivalent to doing row operations on the transpose, which is what your professor did). Both row and column operations preserve the rank of a matrix, and the maximum number of linearly independent rows of a matrix equals the maximum number of linearly independent columns. Whichever method you find simpler for the problem at hand should be fine.
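To see the two reductions side by side in exact arithmetic (a sketch using SymPy's `rref`, an illustration rather than part of the answer), both orientations produce two pivots:

```python
from sympy import Matrix

A = Matrix([[3, -1, 1, 1],
            [1, 3, -1, 1]])

# Row-reduce the 2x4 matrix (vectors as rows) ...
_, pivots_rows = A.rref()
# ... and its 4x2 transpose (vectors as columns).
_, pivots_cols = A.T.rref()

# Two pivots either way, so the rank is 2 in both computations.
print(len(pivots_rows), len(pivots_cols))  # 2 2
```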
I think the fastest way is to argue by contradiction.
Let $X=(3,-1,1,1)$ and $Y=(1,3,-1,1)$.
Assume $X$ and $Y$ are linearly dependent. Then, since neither vector is zero, there is a scalar $a$ such that
$X= aY$.
Using this on the first coordinate, we get $3 = a$. But then on the second coordinate we get $-1 = 3a$ so $a = -\frac{1}{3} \neq 3$. Contradiction.
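The coordinate-by-coordinate argument can be sketched in code (a hypothetical helper, not taken from the answer itself): solve for $a$ using the first nonzero coordinate of $Y$, then check whether $X = aY$ holds in every coordinate.

```python
from fractions import Fraction

def is_scalar_multiple(X, Y):
    """Return True if X == a*Y for some scalar a (exact arithmetic)."""
    # Find a candidate a from the first nonzero coordinate of Y.
    for x, y in zip(X, Y):
        if y != 0:
            a = Fraction(x, y)
            break
    else:
        return all(x == 0 for x in X)  # Y is the zero vector
    # Check that the same a works in every coordinate.
    return all(Fraction(x) == a * y for x, y in zip(X, Y))

X = (3, -1, 1, 1)
Y = (1, 3, -1, 1)
# The first coordinate forces a = 3, but then -1 != 3*3 in the second.
print(is_scalar_multiple(X, Y))  # False
```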
This is a consequence of the result that the row rank of a matrix equals its column rank. Let $ T : V \to W $ be a linear map. Then the column rank of $ T $ is the dimension of its image. Put $ T $ into matrix form by fixing bases for $ V $ and $ W $. Now, the kernel of $ T $ consists of the solutions of $ Tx = 0 $, and by putting $ T $ into reduced row echelon form we see that the dimension of the kernel equals the number of non-pivot columns. The row rank of $ T $ is the number of nonzero rows of the reduced row echelon form, which is the number of pivot columns, and since every column is either a pivot column or a non-pivot column, we have $ \textrm{row rank} + \dim \ker T = \dim V $.
On the other hand, the rank-nullity theorem tells us that
$$ \dim \operatorname{Im} T + \dim \ker T = \dim V $$
so it follows that $ \textrm{row rank} = \dim \operatorname{Im} T = \textrm{column rank} $.
Now, the column rank of a matrix $ A $ is the row rank of its transpose $ A^T $, but by the above result this is also the column rank of $ A^T $. Therefore, a matrix and its transpose have the same rank. This means that what you did and what your professor did are equivalent.
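The identities used above can be illustrated numerically (a sketch with SymPy; the specific matrix is just the one from the question, and the library calls are an assumption, not part of the proof):

```python
from sympy import Matrix

A = Matrix([[3, -1, 1, 1],
            [1, 3, -1, 1]])

rank = A.rank()
nullity = len(A.nullspace())  # dimension of the kernel

# Rank-nullity: rank + dim ker = number of columns (dim V).
print(rank, nullity, A.cols)       # 2 2 4

# A matrix and its transpose have the same rank.
print(A.rank() == A.T.rank())      # True
```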