The trace of a square matrix \(A = [a_{ij}]\) of order \(n \times n\text{,}\) denoted by \(\mathrm{tr}(A)\text{,}\) is defined as the sum of the entries along its main diagonal: \(\mathrm{tr}(A) = \sum_{i=1}^{n} a_{ii}\text{.}\)
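As a quick sketch of the definition outside Sage (assuming NumPy is available; the matrix here is a hypothetical example, not one from the text), the trace is just the sum of the diagonal entries:

```python
import numpy as np

# Hypothetical 3x3 example matrix
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# tr(A) = a_11 + a_22 + a_33 = 1 + 5 + 9 = 15
print(np.trace(A))  # 15

# Equivalent to summing the diagonal explicitly
print(sum(A[i, i] for i in range(A.shape[0])))  # 15
```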
The \(L_2\) norm of a matrix \(A\) is defined as the largest singular value of \(A\text{,}\) which is the square root of the largest eigenvalue of \(A^T A\text{.}\) We can verify that the value computed by Sage is indeed the largest singular value of \(A\) as follows.
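The same verification can be sketched in NumPy (an assumption of this example; the text itself uses Sage): the largest singular value of \(A\) should equal the square root of the largest eigenvalue of \(A^T A\), and both should match the 2-norm returned by the library.

```python
import numpy as np

# Hypothetical example matrix
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Largest singular value of A (singular values come back sorted descending)
sigma_max = np.linalg.svd(A, compute_uv=False)[0]

# Square root of the largest eigenvalue of A^T A
lam_max = np.max(np.linalg.eigvalsh(A.T @ A))

# All three quantities agree: the spectral (L2) norm of A
assert np.isclose(sigma_max, np.sqrt(lam_max))
assert np.isclose(np.linalg.norm(A, 2), sigma_max)
```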
The Sage documentation states that the norm() method computes the Euclidean norm by default, but the value actually returned is the spectral norm (the \(L_2\) norm). The documentation also implies that the Euclidean norm differs from the Frobenius norm, which is not the case: the Euclidean norm and the Frobenius norm are the same, and both differ from the \(L_2\) norm.
For a matrix \(A_{n \times m}\text{,}\) the Euclidean norm (also known as the Frobenius norm or the Hilbert-Schmidt norm) is the entrywise norm defined as the square root of the sum of the squares of the absolute values of all its entries: \(\|A\|_F = \sqrt{\sum_{i=1}^{n}\sum_{j=1}^{m} |a_{ij}|^2}\text{.}\)
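The distinction can be checked numerically with a short NumPy sketch (NumPy here is an assumption; the text's own computations are in Sage). The entrywise definition of the Frobenius norm matches the library's 'fro' norm, and for a generic matrix it differs from the spectral norm:

```python
import numpy as np

# Hypothetical example matrix
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Frobenius norm from the entrywise definition:
# sqrt(1 + 4 + 9 + 16) = sqrt(30)
frob = np.sqrt(np.sum(np.abs(A) ** 2))
assert np.isclose(frob, np.linalg.norm(A, 'fro'))

# Spectral (L2) norm: the largest singular value of A
spectral = np.linalg.norm(A, 2)

# Frobenius >= spectral always, with equality only for rank-1 matrices;
# this A has rank 2, so the inequality is strict
assert frob > spectral
```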