
For a given $n \times n$ matrix $A$ and $J\subseteq\{1,\dots,n\}$, denote by $A[J]$ the principal minor of $A$ formed by the rows and columns with indices in $J$.

If the characteristic polynomial of $A$ is $x^n+a_{n-1}x^{n-1}+\cdots+a_1x+a_0$, then why $$a_k=(-1)^{n-k}\sum_{|J|=n-k}A[J],$$ that is, why is each coefficient the sum of the appropriately sized principal minors of $A$?
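Before any proof, the identity can be sanity-checked numerically. Here is a quick sketch with NumPy (the $4\times4$ random matrix and the helper `minor_sum` are mine, just for illustration):

```python
# Numerical sanity check (not a proof) of the identity: the coefficient a_k of
# det(xI - A) equals (-1)^{n-k} times the sum of the (n-k) x (n-k) principal
# minors of A.
from itertools import combinations
import numpy as np

def minor_sum(A, m):
    """Sum of all m x m principal minors of A (1.0 for the empty minor)."""
    n = A.shape[0]
    if m == 0:
        return 1.0
    return sum(np.linalg.det(A[np.ix_(J, J)]) for J in combinations(range(n), m))

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

# np.poly(A) returns the coefficients of det(xI - A), highest degree first:
# [1, a_{n-1}, ..., a_0], so the coefficient a_k sits at index n - k.
coeffs = np.poly(A)
for k in range(n):
    assert np.isclose(coeffs[n - k], (-1) ** (n - k) * minor_sum(A, n - k))
```

In particular $a_{n-1}=-\operatorname{Tr}(A)$ and $a_0=(-1)^n\det A$ drop out as the two extreme cases.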


4 Answers


Use the fact that $\begin{vmatrix} a & b+e \\ c & d+f \end{vmatrix} = \begin{vmatrix} a & b \\ c & d \end{vmatrix} + \begin{vmatrix} a & e \\ c & f \end{vmatrix} $

We can use this fact to separate out powers of $\lambda$. Here is an example for a $2 \times 2$ matrix: $$ \begin{vmatrix} a-\lambda & b \\ c & d-\lambda \end{vmatrix} = \begin{vmatrix} a & b \\ c & d-\lambda \end{vmatrix} + \begin{vmatrix} -\lambda & b \\ 0 & d-\lambda \end{vmatrix} = \begin{vmatrix} a & b \\ c & d \end{vmatrix} + \begin{vmatrix} a & 0 \\ c & -\lambda \end{vmatrix} + \begin{vmatrix} -\lambda & b \\ 0 & d \end{vmatrix} + \begin{vmatrix} -\lambda & 0 \\ 0 & -\lambda \end{vmatrix} $$

This decomposes the determinant into a sum of terms, grouped by powers of $\lambda$.

Now try it with a $3 \times 3$ matrix and then generalize it.
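The suggested $3 \times 3$ exercise can also be done symbolically. A sketch with SymPy (assuming it is available), comparing each coefficient of $\det(\lambda I - A)$ for a generic matrix with the signed sum of principal minors:

```python
# Symbolic check of the 3 x 3 case (a sketch; assumes SymPy is installed):
# expand det(lam*I - A) for a generic matrix and compare each coefficient
# with the corresponding signed sum of principal minors.
from itertools import combinations
import sympy as sp

lam = sp.symbols('lam')
n = 3
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'a{i}{j}'))
p = (lam * sp.eye(n) - A).det().expand()

for k in range(n):
    a_k = p.coeff(lam, k)
    minors = sum(A[list(J), list(J)].det() for J in combinations(range(n), n - k))
    assert sp.expand(a_k - (-1) ** (n - k) * minors) == 0
```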


One way to see it: $A:V\to V$ induces the (again linear) maps $\wedge^k A:\wedge^k V\to \wedge^k V$. Your formula, restated in an invariant way (i.e. independently of basis), says that $$\det(xI-A)=x^n-x^{n-1}\operatorname{Tr}(A)+ x^{n-2}\operatorname{Tr}(\wedge^2 A)-\cdots \qquad (*)$$ Both sides of $(*)$ are unchanged under conjugation, so (working over an algebraically closed field, say $\mathbb{C}$) we can conjugate $A$ so that it becomes upper triangular with diagonal entries $\lambda_i$ (the $\lambda_i$'s are the roots of the characteristic polynomial). For upper triangular matrices, formula $(*)$ says that $$(x-\lambda_1)\cdots(x-\lambda_n)=x^n-x^{n-1}\Bigl(\sum_i\lambda_i\Bigr)+x^{n-2}\Bigl(\sum_{i<j}\lambda_i\lambda_j\Bigr)-\cdots,$$ which is certainly true; hence $(*)$ is true.
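This invariant statement lends itself to a numerical sanity check: $\operatorname{Tr}(\wedge^k A)$ is the $k$-th elementary symmetric polynomial of the eigenvalues, so it should agree with the sum of the $k\times k$ principal minors. A quick NumPy sketch (the random matrix is arbitrary, not part of the argument):

```python
# Numerical check that e_k(lambda_1, ..., lambda_n) -- which is Tr of the k-th
# exterior power of A -- equals the sum of the k x k principal minors of A,
# as formula (*) asserts.  Not a proof.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
eigvals = np.linalg.eigvals(A)  # may be complex; the sums below are then complex

for k in range(1, n + 1):
    # k-th elementary symmetric polynomial of the eigenvalues
    e_k = sum(np.prod(eigvals[list(S)]) for S in combinations(range(n), k))
    minors = sum(np.linalg.det(A[np.ix_(J, J)]) for J in combinations(range(n), k))
    assert np.isclose(e_k, minors)
```

The $k=1$ and $k=n$ cases recover the familiar facts that the eigenvalues sum to the trace and multiply to the determinant.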


$\newcommand\sgn{\operatorname{sgn}}$I learned the following proof from @J_P's answer to what is effectively the same question. It arises from expanding the usual definition $\det A=\sum_{\sigma\in S_n}\sgn\sigma\prod_{1\le k\le n}A_{k,\sigma(k)}$, and deserves to be better known than it currently is.

Let $[n]:=\{1,\dots,n\}$, and write $\delta_{i,j}$ for the Kronecker delta, which is equal to $1$ if $i=j$, and is $0$ otherwise. Note that $\prod_{1\le k\le n}(a_k+b_k)=\sum_{C\subseteq[n]}\prod_{i\in C}a_i\prod_{j\in[n]-C}b_j$, since every term in the expansion on the left hand side will choose from each expression $(a_k+b_k)$ either $a_k$ or $b_k$, and so we may sum over all possible ways $C$ of choosing the $a_k$ terms. We compute\begin{align*} \det(tI-A) &=\sum_{\sigma\in S_n}\sgn\sigma\prod_{1\le k\le n} (t\delta_{k,\sigma(k)}-A_{k,\sigma(k)})\\ &=\sum_{\sigma\in S_n}\sgn\sigma\sum_{C\subseteq[n]} \prod_{i\in C}(-A_{i,\sigma(i)})\prod_{j\in[n]-C}t\delta_{j,\sigma(j)}\\ &=\sum_{C\subseteq[n]}(-1)^{|C|}\sum_{\sigma\in S_n}\sgn\sigma \prod_{i\in C}A_{i,\sigma(i)}\prod_{j\in[n]-C}t\delta_{j,\sigma(j)}. \end{align*}For fixed $C\subseteq[n]$ and $\sigma\in S_n$, the last product $\prod_{j\in[n]-C}t\delta_{j,\sigma(j)}$ vanishes unless $\sigma$ fixes the elements of $[n]-C$, in which case the product is just $t^{n-|C|}$. So we need only consider the contributions of the permutations of $C$ in our sum, by thinking of a permutation $\sigma\in S_n$ that fixes $[n]-C$ as a permutation in $S_C$. The sign of this permutation considered as an element of $S_C$ remains the same, as can be seen if we consider the sign as $(-1)^{T(\sigma)}$, where $T(\sigma)$ is the number of transpositions of $\sigma$. We thus have\begin{align*} \sum_{C\subseteq[n]}(-1)^{|C|}\sum_{\sigma\in S_n}\sgn\sigma \prod_{i\in C}A_{i,\sigma(i)}\prod_{j\in[n]-C}t\delta_{j,\sigma(j)} &=\sum_{C\subseteq[n]}(-1)^{|C|}\sum_{\sigma\in S_C}\sgn\sigma \prod_{i\in C}A_{i,\sigma(i)}t^{n-|C|}\\ &=\sum_{C\subseteq[n]}(-1)^{|C|}t^{n-|C|}\sum_{\sigma\in S_C}\sgn\sigma \prod_{i\in C}A_{i,\sigma(i)}. 
\end{align*}The term $\sum_{\sigma\in S_C}\sgn\sigma\prod_{i\in C}A_{i,\sigma(i)}$ is precisely the determinant of the principal submatrix $A_{C\times C}$, which is the $|C|\times|C|$ matrix with rows and columns indexed by $C$, and so\begin{align*} \sum_{C\subseteq[n]}(-1)^{|C|}t^{n-|C|}\sum_{\sigma\in S_C}\sgn\sigma \prod_{i\in C}A_{i,\sigma(i)} &=\sum_{C\subseteq[n]}(-1)^{|C|}t^{n-|C|}\det(A_{C\times C})\\ &=\sum_{0\le k\le n}\sum_{\substack{C\subseteq[n]\\|C|=k}}(-1)^kt^{n-k} \det(A_{C\times C})\\ &=\sum_{0\le k\le n}t^{n-k}\left((-1)^k \sum_{\substack{C\subseteq[n]\\|C|=k}} \det(A_{C\times C})\right)\\ &=\sum_{0\le k\le n}t^k\left((-1)^{n-k} \sum_{\substack{C\subseteq[n]\\|C|=n-k}} \det(A_{C\times C})\right). \end{align*}
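The key reduction above, that $\sum_{\sigma\in S_C}\sgn\sigma\prod_{i\in C}A_{i,\sigma(i)}$ is exactly $\det(A_{C\times C})$, can be checked by brute force. A small NumPy sketch (the `perm_sign` helper and the random matrix are mine, not part of the proof):

```python
# Brute-force check (not a proof): for every subset C of [n], the Leibniz-style
# sum over permutations of C equals det(A_{C x C}).
from itertools import combinations, permutations
import numpy as np

def perm_sign(sigma):
    """Sign of a permutation of {0,...,m-1}, computed by counting inversions."""
    m = len(sigma)
    inversions = sum(1 for i in range(m) for j in range(i + 1, m) if sigma[i] > sigma[j])
    return -1 if inversions % 2 else 1

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))

for m in range(1, n + 1):
    for C in combinations(range(n), m):
        # Sum over permutations of C, realized as permutations of {0,...,m-1}
        leibniz = sum(
            perm_sign(sigma) * np.prod([A[C[i], C[sigma[i]]] for i in range(m)])
            for sigma in permutations(range(m))
        )
        assert np.isclose(leibniz, np.linalg.det(A[np.ix_(C, C)]))
```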


Here's another way, using Taylor's theorem.

Consider $\det (xI+A)$ as a polynomial $p(x)$. By Taylor's theorem, $$ p(x)=\sum_{i=0}^n\frac{p^{(i)}(0)}{i!}x^i. $$ Computing $p^{(i)}(0)$ leads quickly to the conclusion.


How do we compute $p^{(i)}(0)$? Well, here's a trick:

For instance, to compute $p'(0)$: go back to the determinant, replace the $x$ in the $k$th row by $x_k$, and use the total derivative. You'll find $$p'(0)=\sum_{|J|=n-1}A[J].$$

By induction, one can show in general that $$p^{(i)}(0)=i!\sum_{|J|=n-i}A[J].$$
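This derivative formula can be sanity-checked numerically: the coefficient of $x^i$ in $p(x)=\det(xI+A)$ is $p^{(i)}(0)/i!$, which should equal $\sum_{|J|=n-i}A[J]$. A NumPy sketch (the random matrix and `minor_sum` helper are for illustration only):

```python
# Numerical check (not a proof) that p^(i)(0) = i! * sum of the (n-i) x (n-i)
# principal minors of A, where p(x) = det(xI + A).
from itertools import combinations
from math import factorial
import numpy as np

def minor_sum(A, m):
    """Sum of all m x m principal minors of A (1.0 for the empty minor)."""
    n = A.shape[0]
    if m == 0:
        return 1.0
    return sum(np.linalg.det(A[np.ix_(J, J)]) for J in combinations(range(n), m))

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))

# det(xI + A) = det(xI - (-A)), so np.poly(-A) gives p's coefficients,
# highest degree first; the coefficient of x^i sits at index n - i.
c = np.poly(-A)
for i in range(n + 1):
    p_i_at_0 = factorial(i) * c[n - i]  # i-th derivative of p at 0
    assert np.isclose(p_i_at_0, factorial(i) * minor_sum(A, n - i))
```

Note the signs here differ from the question's statement because this answer works with $\det(xI+A)$ rather than $\det(xI-A)$.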

