IMC2015: Day 2, Problem 9.

An $n \times n$ complex matrix $A$ is called \emph{t-normal} if $AA^t = A^t A$, where $A^t$ is the transpose of $A$. For each $n$, determine the maximum dimension of a linear space of complex $n \times n$ matrices consisting of t-normal matrices.

Proposed by Shachar Carmeli, Weizmann Institute of Science

Solution. Answer: the maximum dimension of such a space is $\frac{n(n + 1)}{2}$.

The dimension $\frac{n(n + 1)}{2}$ can be achieved: the symmetric matrices are obviously t-normal, and they form a linear space of dimension $\frac{n(n + 1)}{2}$. We shall show that this is the maximal possible dimension.

Let $M_n$ denote the space of $n \times n$ complex matrices, let $S_n \subset M_n$ be the subspace of all symmetric matrices, and let $A_n \subset M_n$ be the subspace of all anti-symmetric matrices, i.e. matrices $A$ for which $A^t = -A$.

Let $V \subset M_n$ be a linear subspace consisting of t-normal matrices. We have to show that $\dim(V) \le \dim(S_n)$. Let $\pi : V \to S_n$ denote the linear map $\pi(A) = A + A^t$. We have
\[ \dim(V) = \dim(\ker(\pi)) + \dim(\operatorname{im}(\pi)), \]
so it suffices to prove that $\dim(\ker(\pi)) + \dim(\operatorname{im}(\pi)) \le \dim(S_n)$. Notice that $\ker(\pi) \subseteq A_n$.

We claim that $A\pi(B) = \pi(B)A$ for every $A \in \ker(\pi)$ and $B \in V$; in other words, $\ker(\pi)$ and $\operatorname{im}(\pi)$ commute. Indeed, if $A, B \in V$ and $A = -A^t$, then, using $AA^t = A^t A$ and $BB^t = B^t B$,
\[ (A + B)(A + B)^t = (A + B)^t(A + B) \Leftrightarrow \]
\[ \Leftrightarrow AA^t + AB^t + BA^t + BB^t = A^tA + A^tB + B^tA + B^tB \Leftrightarrow \]
\[ \Leftrightarrow AB^t - BA = -AB + B^tA \Leftrightarrow A(B + B^t) = (B + B^t)A \Leftrightarrow \]
\[ \Leftrightarrow A \pi(B) = \pi(B)A. \]

Our bound on the dimension of $V$ follows from the following lemma.

Lemma. Let $X \subseteq S_n$ and $Y \subseteq A_n$ be linear subspaces such that every element of $X$ commutes with every element of $Y$. Then
\[ \dim(X) + \dim(Y) \le \dim(S_n). \]

Proof.
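Before working through the proof, the two algebraic facts used so far can be sanity-checked numerically: symmetric matrices are t-normal, and for anti-symmetric $A$ and arbitrary $B$, the t-normality defect of $A + B$ equals that of $B$ plus the commutator $[A, B + B^t]$, so t-normality of $A$, $B$ and $A + B$ forces $A\pi(B) = \pi(B)A$. A minimal sketch in plain Python (standard library only; random complex test matrices, so this is an illustration, not part of the proof):

```python
import random

n = 4  # any n works; the identities checked below are purely algebraic

def rand_matrix():
    """Random n x n complex matrix."""
    return [[complex(random.uniform(-1, 1), random.uniform(-1, 1))
             for _ in range(n)] for _ in range(n)]

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(n)] for i in range(n)]

def sub(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(n)] for i in range(n)]

def T(X):
    """Plain transpose A^t -- no conjugation, matching the problem."""
    return [[X[j][i] for j in range(n)] for i in range(n)]

def norm(X):
    return max(abs(X[i][j]) for i in range(n) for j in range(n))

def t_defect(M):
    """M M^t - M^t M; the zero matrix iff M is t-normal."""
    return sub(mul(M, T(M)), mul(T(M), M))

# 1. Symmetric matrices are t-normal: S = S^t gives S S^t = S^2 = S^t S.
R = rand_matrix()
S = add(R, T(R))                      # random symmetric matrix
assert norm(t_defect(S)) < 1e-9

# 2. For anti-symmetric A and arbitrary B, expanding as in the solution:
#    (A+B)(A+B)^t - (A+B)^t(A+B) = (B B^t - B^t B) + (A pi(B) - pi(B) A)
# with pi(B) = B + B^t (the A A^t - A^t A term vanishes since A^t = -A).
Q = rand_matrix()
A = sub(Q, T(Q))                      # random anti-symmetric matrix
B = rand_matrix()
piB = add(B, T(B))
lhs = t_defect(add(A, B))
rhs = add(t_defect(B), sub(mul(A, piB), mul(piB, A)))
assert norm(sub(lhs, rhs)) < 1e-9
```

Since the defect of $A$ alone vanishes automatically, the checked identity shows that when $A$, $B$ and $A + B$ are all t-normal, the commutator $[A, \pi(B)]$ must be zero, which is exactly the claim.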
Without loss of generality we may assume that $X = Z_{S_n}(Y) := \{x \in S_n : xy = yx \ \ \forall y \in Y\}$, since replacing $X$ by this centralizer preserves the hypothesis and can only increase $\dim(X)$. Define the bilinear map $B : S_n \times A_n \to \mathbb{C}$ by $B(x,y) = \operatorname{tr}(d[x,y])$, where $[x,y] = xy - yx$ and $d = \operatorname{diag}(1,\dots,n)$ is the matrix with diagonal entries $1,\dots,n$ and zeros off the diagonal. Clearly $B(X,Y) = \{0\}$.

Furthermore, if $y \in Y$ satisfies $B(x,y) = 0$ for all $x \in S_n$, then, by the cyclicity of the trace, $\operatorname{tr}(d[x,y]) = \operatorname{tr}([d,x]y) = 0$ for every $x \in S_n$. We claim that $\{[d,x] : x \in S_n\} = A_n$. Let $E^j_i$ denote the matrix with $1$ in the entry $(i,j)$ and $0$ in all other entries. A direct computation shows that $[d, E^j_i] = (i - j) E^j_i$, and therefore $[d, E^j_i + E^i_j] = (i - j)(E^j_i - E^i_j)$; the collection $\{(i - j)(E^j_i - E^i_j)\}_{1 \le i < j \le n}$ spans $A_n$. It follows that if $B(x,y) = 0$ for all $x \in S_n$, then $\operatorname{tr}(yz) = 0$ for every $z \in A_n$. But then, taking $z = \bar{y}$, where $\bar{y}$ is the entry-wise complex conjugate of $y$, we get $0 = \operatorname{tr}(y \bar{y}) = -\operatorname{tr}(y \bar{y}^t)$, and $\operatorname{tr}(y \bar{y}^t) = \sum_{i,j} |y_{ij}|^2$ is the sum of the squared absolute values of all the entries of $y$. This means that $y = 0$.

It follows that if $y_1,\dots,y_k \in Y$ are linearly independent, then the equations
$$ B(x,y_1) = 0, \quad \ldots, \quad B(x,y_k) = 0 $$
are linearly independent as linear equations in $x$: otherwise there would exist $a_1,\dots,a_k$, not all zero, such that $B(x, a_1 y_1 + \dots + a_k y_k) = 0$ for every $x \in S_n$, contradicting the observation above. Since the solution set of $k$ linearly independent linear equations has codimension $k$,
$$ \dim(\{x \in S_n : [x,y_i] = 0 \text{ for } i = 1,\dots,k\}) \le $$
$$ \le \dim(\{x \in S_n : B(x,y_i) = 0 \text{ for } i = 1,\dots,k\}) = \dim(S_n) - k. $$
The lemma follows by taking $y_1,\dots,y_k$ to be a basis of $Y$.

Since $\ker(\pi)$ and $\operatorname{im}(\pi)$ commute, the lemma gives
\[ \dim(V) = \dim(\ker(\pi)) + \dim(\operatorname{im}(\pi)) \le \dim(S_n) = \frac{n(n + 1)}{2}. \]
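The two computational steps inside the lemma's proof, the commutator formula $[d, E^j_i] = (i - j)E^j_i$ and the trace identity $\operatorname{tr}(d[x,y]) = \operatorname{tr}([d,x]y)$, can likewise be verified numerically. A small sketch in plain Python (0-based indices, which leaves the factor $i - j$ unchanged; an illustration only):

```python
import random

n = 4

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def sub(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(n)] for i in range(n)]

def comm(X, Y):
    """Commutator [X, Y] = XY - YX."""
    return sub(mul(X, Y), mul(Y, X))

def tr(X):
    return sum(X[i][i] for i in range(n))

# d = diag(1, ..., n)
d = [[i + 1 if i == j else 0 for j in range(n)] for i in range(n)]

def E(i, j):
    """E^j_i: 1 in entry (i, j), 0 elsewhere (0-based here)."""
    return [[1 if (r, c) == (i, j) else 0 for c in range(n)] for r in range(n)]

# [d, E^j_i] = (i - j) E^j_i, checked in exact integer arithmetic.
for i in range(n):
    for j in range(n):
        scaled = [[(i - j) * v for v in row] for row in E(i, j)]
        assert comm(d, E(i, j)) == scaled

# tr(d [x, y]) = tr([d, x] y), a consequence of the cyclicity of the trace,
# checked on random complex matrices.
def rand_matrix():
    return [[complex(random.uniform(-1, 1), random.uniform(-1, 1))
             for _ in range(n)] for _ in range(n)]

x, y = rand_matrix(), rand_matrix()
assert abs(tr(mul(d, comm(x, y))) - tr(mul(comm(d, x), y))) < 1e-9
```

The first loop confirms that commuting with $d$ rescales each off-diagonal matrix unit by the difference of its indices (and kills the diagonal ones), which is why $\{[d,x] : x \in S_n\}$ spans all of $A_n$.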
© IMC