
I was working through a problem and was wondering if there was an easier way of finding the basis of the left null space of a given matrix.

For a simple example, suppose we have the matrix $A = \begin{bmatrix} 1 & 2 & 4 \\ 2 & 4 & 8 \end{bmatrix}$. Row reducing, we get $\text{rref}(A) = \begin{bmatrix} 1 & 2 & 4 \\ 0 & 0 & 0 \end{bmatrix}$.

From $\text{rref}(A)$ it is clear that:

Basis for $C(A) = \left\{ \begin{pmatrix} 1 \\ 2\end{pmatrix} \right\}$

Basis for $C(A^T) = \left\{ \begin{pmatrix} 1 & 2 & 4 \end{pmatrix} \right\}$

Basis for $N(A) = \left\{ \begin{pmatrix} -2 \\ 1 \\ 0\end{pmatrix} , \begin{pmatrix} -4 \\ 0 \\ 1 \end{pmatrix}\right\}$

Now my question is: am I able to deduce the left null space just from $\text{rref}(A)$?

Otherwise, I would take the transpose of $A$, row reduce it, and then find the left null space that way, but I was wondering if there is an easier approach.
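For concreteness, all of these bases can be checked with SymPy (a sketch; SymPy's `Matrix.rref` and `nullspace` methods are assumed available):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 4],
               [2, 4, 8]])

# Reduced row echelon form and pivot column indices
R, pivots = A.rref()

# N(A): basis for the null space (two free columns -> two vectors)
null_basis = A.nullspace()

# N(A^T): the "usual" route -- row reduce the transpose
left_null_basis = A.T.nullspace()

print(R)                # Matrix([[1, 2, 4], [0, 0, 0]])
print(null_basis)
print(left_null_basis)  # [Matrix([[-2], [1]])]
```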

3 Answers

You can’t really get the left null space directly from just the rref, but if you first augment the matrix with the appropriately-sized identity and then row-reduce it, the row vectors to the right of the zero rows of the rref constitute a basis for the left null space.

Using your example, row-reduce $$\left[\begin{array}{ccc|cc}1&2&4 & 1&0 \\ 2&4&8 & 0&1 \end{array}\right] \to \left[\begin{array}{ccc|rc} 1&2&4 & 1 &0 \\ 0&0&0 & -2&1 \end{array}\right].$$ The left null space is thus $\operatorname{span}\{(-2,1)\}$.

As for why this works, see this question. I'll repeat a caveat from there: this method often doesn't give you a "nice" basis, in that the vectors are often rather large multiples of what you would've computed by the more usual method of applying Gaussian elimination to the transpose.
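The augmentation trick can be sketched in SymPy (assuming SymPy; `echelon_form` is used instead of a full `rref` so the bookkeeping columns are not rescaled beyond the hand elimination shown above):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 4],
               [2, 4, 8]])
m = A.rows

# Augment A with the m x m identity, then eliminate.
# echelon_form stops at row echelon form, matching the hand
# reduction above; a full rref would keep going and rescale
# the bookkeeping columns.
aug = A.row_join(sp.eye(m))
E = aug.echelon_form()

left_part = E[:, :A.cols]    # echelon form of A
record = E[:, A.cols:]       # record of the row operations

# Rows of the record opposite the zero rows of the reduced A
# form a basis for the left null space.
basis = [record.row(i) for i in range(m)
         if all(e == 0 for e in left_part.row(i))]
print(basis)
```

Each extracted row records the combination of original rows that produced a zero row, which is exactly a left null vector.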


In general it's not possible since row operations preserve the solutions of $Ax=0$ but do not preserve the solutions of $x^TA=0$. Thus you have to refer to the original matrix and operate by columns.

In this simple example, you can consider the pivot column of the original matrix, $\begin{pmatrix} 1 \\ 2 \end{pmatrix}$, and find the left null space from it; from $\text{rref}(A)$ you only know that its dimension is $1$.
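The failure of row operations to preserve $x^TA = 0$ can be seen concretely (a small SymPy check; $(-2, 1)$ is the left null vector found from the original rows):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 4], [2, 4, 8]])
R = A.rref()[0]             # row operations replace A with R
v = sp.Matrix([[-2, 1]])    # left null vector of the ORIGINAL A

print(v * A)   # Matrix([[0, 0, 0]]): v^T A = 0 holds for A
print(v * R)   # Matrix([[-2, -4, -8]]): not zero for rref(A)
```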


You know the usual way to find a basis for the left null space $N(A^T)$: transpose the matrix $A$, row reduce to identify the pivot columns and free columns, then assign the free variables and solve for the pivot variables. It is the same procedure as finding the null space of $A$.

As an easier method, I recommend direct "observation". A left null vector is a combination of the rows of $A$ whose result is zero. Given your matrix, it is very easy to see that $(-2, 1)$ works: $-2 \cdot (\text{first row}) + (\text{second row}) = 0$.

I admit that direct "observation" sounds like a silly method, but in most linear algebra exercises the matrices are $m \times n$ with $m < n$, for example $3 \times 5$. It's hard to imagine how to combine 5 column vectors, but it's much easier to find a way to combine 3 rows in seconds.

