The Determinant of the Anti-Diagonal Matrix

Let us define the “anti-diagonal” matrix of dimension k, J_k, as the matrix with ones on the anti-diagonal and zeros everywhere else. For instance, the first few matrices are:
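J_1 = \begin{pmatrix} 1 \end{pmatrix}, \quad J_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad J_3 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}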

The question is to find the determinant of J_k. It’s a straightforward question, but I am writing a blog post on it because it admits several different approaches, and it is interesting to see which one you reach for. This post documents five solutions to the problem.

Solution 1: Calculation via minors

The usual way of actually calculating the determinant of a matrix is to use ‘minors’. Let us expand along the first column. All the entries in the first column are zero except for the bottom one, and so we get:

|J_k| = (-1)^{k+1} |J_{k-1}|

With the base case |J_1| = 1, it follows that:

|J_k| = (-1)^{ (k+1) + k + (k-1) + \cdots + 3} = (-1)^{(k-1)(k+4)/2}

In order to bring it to the same form as the following solutions, write k = 2r + s with s either zero or one. If k = 2r then (k-1)(k+4)/2 = (2r-1)(r+2) = 2r^2 + 3r - 2, and if k = 2r + 1 then (k-1)(k+4)/2 = r(2r+5) = 2r^2 + 5r. In both cases the exponent has the same parity as r, so

(-1)^{(k-1)(k+4)/2} = (-1)^r
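As a quick sanity check, here is a small numerical sketch (assuming numpy is available; it is purely illustrative and not part of the proof):

import numpy as np

# Compare the closed form against a numerically computed determinant.
for k in range(1, 11):
    J = np.fliplr(np.eye(k))          # the anti-diagonal matrix J_k
    det = round(np.linalg.det(J))     # exactly +1 or -1 for J_k
    assert det == (-1) ** ((k - 1) * (k + 4) // 2) == (-1) ** (k // 2)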

Solution 2: Using Eigenvalues and the trace

This method is particularly elegant.

Note that J_k^2 = I, so every eigenvalue \lambda of J_k satisfies \lambda^2 = 1, and thus every eigenvalue is either 1 or -1.

Let Tr(A) be the trace of a matrix A. We have Tr(J_k) = 0 when k is even, and Tr(J_k) = 1 when k is odd. The trace is the sum of the eigenvalues, so in either of the cases k = 2r or k = 2r + 1 we must have r pairs of eigenvalues 1 and -1. (In the odd case there is one additional eigenvalue equal to 1.) As the determinant is the product of the eigenvalues, it must be that |J_k| = (-1)^r.
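Spelling out the counting: if p eigenvalues equal 1 and m equal -1, then p + m = k and p - m = Tr(J_k), so in both the even and odd cases m = r, and

|J_k| = 1^p \cdot (-1)^m = (-1)^r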

I like this method the best, but if it has a flaw it is that it doesn’t easily generalise to the case where the anti-diagonal entries are arbitrary instead of being ones. However, this doesn’t really matter: the multilinearity of the determinant in the rows means you can ‘pull out’ the values anyway, so the determinant is just the product of the anti-diagonal entries multiplied by |J_k|.
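Concretely, if A has entries a_1, \dots, a_k down the anti-diagonal and zeros elsewhere, then pulling a_i out of row i gives

|A| = a_1 a_2 \cdots a_k \, |J_k| = (-1)^r \, a_1 a_2 \cdots a_k

with r as before.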

Solution 3: Via Swaps

The determinant is such that if you swap two adjacent columns (or rows), you reverse the sign. We also know that the determinant of the identity matrix is 1. Therefore it should be possible to turn J_k into I using a series of swaps, keeping track of the sign as we go.

Therefore the problem reduces to counting how many adjacent swaps turn the sequence [k, k-1, …, 2, 1] into the sequence [1, 2, …, k-1, k]. To do this, note it takes k-1 swaps to ‘push the k to the right’, giving [k-1, k-2, …, 1, k], and another k-2 swaps to push the 1 back to the left, giving [1, k-1, k-2, …, 2, k] after 2k-3 swaps in total. The middle k-2 entries are now the reversed sequence again, so we can apply the same argument for k’ = k-2. Since 2k-3 is odd, we have |J_k| = -|J_{k-2}|.

Given the base cases |J_1| = 1 and |J_2| = -1, the result |J_k| = (-1)^r follows.
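A small sketch in Python (illustrative only) that counts the adjacent swaps directly and checks the resulting sign:

def swaps_to_sort(seq):
    # Bubble sort, counting the adjacent swaps performed.
    seq, count = list(seq), 0
    for _ in range(len(seq)):
        for j in range(len(seq) - 1):
            if seq[j] > seq[j + 1]:
                seq[j], seq[j + 1] = seq[j + 1], seq[j]
                count += 1
    return count

# One sign change per adjacent swap, starting from the identity's determinant of 1.
for k in range(1, 11):
    assert (-1) ** swaps_to_sort(range(k, 0, -1)) == (-1) ** (k // 2)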

Solution 4: Via Permutations

This method is essentially the same as the above, but carried out in a different way. We use the permutation (Leibniz) definition of the determinant:
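|A| = \sum_{\sigma \in S_k} \mathrm{sgn}(\sigma) \prod_{i=1}^{k} a_{i,\sigma(i)}

where the sum runs over all permutations \sigma of \{1, \dots, k\} and \mathrm{sgn}(\sigma) is the sign of the permutation.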

Using this definition to calculate |J_k|, we note that only one permutation contributes anything to the sum: the one which takes i to k+1-i. The determinant is therefore just the sign of this permutation.

The permutation swaps the first and last, the second and second-last, and so on; it is a product of disjoint transpositions. If k = 2r, there are r transpositions. If k = 2r + 1, there are still r transpositions, as the central element r+1 remains fixed. Hence the sign of the permutation, and so the determinant of J_k, is (-1)^r.
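For example, for k = 5 the permutation in question is (1 5)(2 4), a product of r = 2 transpositions, so

|J_5| = \mathrm{sgn}\big( (1\,5)(2\,4) \big) = (-1)^2 = 1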

Solution 5: Directly via the Eigenvalues

Whilst this is a different approach from Solution 4, it gives fairly similar calculations. The determinant of a matrix is the product of its eigenvalues, so we can just determine the eigenvalues.

J_k swaps coordinate 1 with coordinate k. Therefore [1,0,…,0,1] is an eigenvector with eigenvalue 1. Also, [1,0,…,0,-1] is an eigenvector which is sent to its negative, and so -1 is an eigenvalue.

As above, there are r pairs of coordinates which are swapped when k = 2r or k = 2r + 1, and each pair contributes one eigenvalue 1 and one eigenvalue -1 (with an extra eigenvalue 1 from the fixed central coordinate in the odd case). Therefore the determinant is (-1)^r.
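Explicitly, writing e_i for the i-th standard basis vector, each of these r pairs gives

J_k (e_i + e_{k+1-i}) = e_i + e_{k+1-i}, \qquad J_k (e_i - e_{k+1-i}) = -(e_i - e_{k+1-i})

and in the odd case the fixed coordinate gives J_k e_{r+1} = e_{r+1}; multiplying these eigenvalues together gives (-1)^r.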

The solutions do have similarities, but they also have differences, and I think it is interesting how many ways there are to approach this simple problem.
