Trace and determinant
The answer is that, for any square matrix \(A\), the trace is the sum of its eigenvalues, that is,
\[
\operatorname{tr}(A) = \sum_{i} \lambda_i,
\]
and the determinant is the product of its eigenvalues, i.e.,
\[
\det(A) = \prod_{i} \lambda_i.
\]
So maximizing the trace or determinant amounts to maximizing the eigenvalues of \(A\).
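A quick numerical check of these two identities, using an illustrative random symmetric matrix (the matrix itself is my example, not from the text; symmetry just keeps the eigenvalues real):

```python
import numpy as np

# Illustrative example: a random symmetric 4x4 matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2

eigvals = np.linalg.eigvalsh(A)
print(np.isclose(np.trace(A), eigvals.sum()))        # trace = sum of eigenvalues
print(np.isclose(np.linalg.det(A), eigvals.prod()))  # det = product of eigenvalues
```

Both checks print `True` for any square matrix, not just symmetric ones; symmetry only avoids complex eigenvalues in the demo.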
I think that using the Frobenius norm as the objective function is incorrect. To make sure we understand this thoroughly, let us go through the basics of the covariance matrix. Consider the data matrix \(\mathbf{X}\), with each column an observation and each row a measurement type. Assuming each row has been centered to zero mean, the covariance matrix is defined as
\[
\mathbf{C_X} = \frac{1}{n-1}\,\mathbf{X}\mathbf{X}^\top,
\]
where \(n\) is the number of observations.
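A minimal numpy sketch of this covariance construction (the data here is an assumed example): rows are measurement types, columns are observations, and after centering each row, \(\mathbf{X}\mathbf{X}^\top/(n-1)\) matches numpy's built-in covariance.

```python
import numpy as np

# Illustrative data: 3 measurement types, 100 observations.
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 100))

Xc = X - X.mean(axis=1, keepdims=True)   # center each measurement type (row)
C = Xc @ Xc.T / (X.shape[1] - 1)         # covariance matrix as defined above
print(np.allclose(C, np.cov(X)))         # agrees with numpy's np.cov
```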
The element \(c_{ij}\) of \(\mathbf{C_X}\) measures the covariance between the \(i\)-th and \(j\)-th measurement types, and when \(i=j\) we call it the variance of measurement type \(i\). Principal component analysis (PCA) seeks a transformation that maximizes \(\mathbf{C_X}\)'s diagonal values (the variances) while driving the off-diagonal values to zero. Since \(\mathbf{C_X}\) is diagonalizable, the objective function is formulated as
\[
\max_{\mathbf{W}} \; \operatorname{tr}\!\left(\mathbf{W}^\top \mathbf{C_X}\, \mathbf{W}\right)
\]
or
\[
\max_{\mathbf{W}} \; \det\!\left(\mathbf{W}^\top \mathbf{C_X}\, \mathbf{W}\right),
\]
where \(\mathbf{W}\) has orthonormal columns.
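The diagonalization argument can be checked directly: taking \(\mathbf{W}\) to be the eigenvectors of \(\mathbf{C_X}\) makes \(\mathbf{W}^\top \mathbf{C_X} \mathbf{W}\) diagonal, i.e. the transformed measurements have zero covariance with each other (data below is an assumed example):

```python
import numpy as np

# Illustrative data matrix: rows are measurement types, columns observations.
rng = np.random.default_rng(2)
X = rng.standard_normal((3, 200))
Xc = X - X.mean(axis=1, keepdims=True)
C = Xc @ Xc.T / (X.shape[1] - 1)

# Eigenvectors of the (symmetric) covariance matrix diagonalize it:
# W^T C W is diagonal, so off-diagonal covariances vanish.
_, W = np.linalg.eigh(C)
D = W.T @ C @ W
off_diag = D - np.diag(np.diag(D))
print(np.allclose(off_diag, 0))
```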
Back to Fisher discriminant analysis: it maximizes the between-class variance while minimizing the within-class variance, so its objective function is formulated as
\[
\max_{\mathbf{W}} \; \operatorname{tr}\!\left(\left(\mathbf{W}^\top \mathbf{S}_W \mathbf{W}\right)^{-1} \mathbf{W}^\top \mathbf{S}_B\, \mathbf{W}\right)
\]
or
\[
\max_{\mathbf{W}} \; \frac{\det\!\left(\mathbf{W}^\top \mathbf{S}_B\, \mathbf{W}\right)}{\det\!\left(\mathbf{W}^\top \mathbf{S}_W\, \mathbf{W}\right)},
\]
where \(\mathbf{S}_B\) and \(\mathbf{S}_W\) are the between-class and within-class scatter matrices.
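For the two-class case this objective has the classical closed-form direction \(\mathbf{w} \propto \mathbf{S}_W^{-1}(\mathbf{m}_1 - \mathbf{m}_0)\). A small sketch with assumed synthetic data:

```python
import numpy as np

# Two illustrative Gaussian classes in 2D with shifted means.
rng = np.random.default_rng(3)
X0 = rng.standard_normal((50, 2))
X1 = rng.standard_normal((50, 2)) + np.array([3.0, 1.0])
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)

# Within-class scatter S_W; the Fisher direction maximizes
# (w^T S_B w) / (w^T S_W w) and equals S_W^{-1} (m1 - m0) up to scale.
S_W = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
w = np.linalg.solve(S_W, m1 - m0)

# The projected mean gap w^T (m1 - m0) is a positive quadratic form,
# since S_W is positive definite.
sep = w @ (m1 - m0)
print(sep > 0)
```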
Since the Frobenius norm is
\[
\|A\|_F = \sqrt{\sum_{i}\sum_{j} |a_{ij}|^2},
\]
it takes all elements of \(A\) into account, diagonal and off-diagonal alike, so it cannot distinguish the variances we want to maximize from the covariances we want to suppress. That is why it doesn't make sense as the objective here.
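The contrast can be made concrete on a tiny example matrix (values are mine, for illustration): the squared Frobenius norm sums every squared entry, off-diagonals included, while the trace depends only on the diagonal.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Squared Frobenius norm = sum of ALL squared entries: 1 + 4 + 9 + 16 = 30.
fro_sq = np.linalg.norm(A, 'fro') ** 2
print(np.isclose(fro_sq, (A ** 2).sum()))

# The trace uses only the diagonal entries: 1 + 4 = 5.
print(np.trace(A) == 5.0)
```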