Here's a neat proof of 1. using matrix methods:
Consider
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$
and its transpose
$$A^T = \begin{pmatrix} a & c \\ b & d \end{pmatrix}.$$
Then
$$AA^T = \begin{pmatrix} a^2 + b^2 & ac + bd \\ ac + bd & c^2 + d^2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I,$$
where $I$ is the identity matrix. This implies that $A^T A = I$ (so $A$ is an orthogonal matrix; for square matrices a one-sided inverse is automatically two-sided), and so we must have that
$$A^T A = \begin{pmatrix} a^2 + c^2 & ab + cd \\ ab + cd & b^2 + d^2 \end{pmatrix} = I,$$
so by equating entries, we obtain the result.
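For anyone who wants to see the entry-equating step written out, here's a quick SymPy sketch of my own (not part of the original question) that computes both products symbolically:

```python
from sympy import symbols, Matrix

# A generic 2x2 real matrix
a, b, c, d = symbols('a b c d', real=True)
A = Matrix([[a, b], [c, d]])

# The entries of A*A^T are exactly the given conditions...
print(A * A.T)  # Matrix([[a**2 + b**2, a*c + b*d], [a*c + b*d, c**2 + d**2]])

# ...and the entries of A^T*A are exactly the conditions to be proved.
print(A.T * A)  # Matrix([[a**2 + c**2, a*b + c*d], [a*b + c*d, b**2 + d**2]])
```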
(Long) Edit: /0 gave me a major hint by showing that $\det A = \pm 1$, which is a necessary (but not sufficient) condition for the matrix $A$ to be orthogonal; for example, $\begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$ has determinant $1$ but is not orthogonal. The proof above, though, shows that the conditions of the question are necessary and sufficient for $A$ to be orthogonal. That is,
$$a^2 + b^2 = c^2 + d^2 = 1 \text{ and } ac + bd = 0 \iff A \text{ is orthogonal.}$$
This basically follows from the fact that an $n \times n$ matrix $A$ with real entries is orthogonal if and only if
$$\langle Ax, Ay \rangle = \langle x, y \rangle \quad \text{for all } x, y \in \mathbb{R}^n.$$
Here $\langle \cdot, \cdot \rangle$ denotes the standard inner product of two (column) vectors $x, y \in \mathbb{R}^n$. It is closely related to the dot product; indeed, $\langle x, y \rangle = x^T y = x \cdot y$.
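As a quick sanity check (a sketch of my own, using a rotation matrix as the example of an orthogonal matrix), NumPy confirms both the identification $\langle x, y \rangle = x \cdot y$ and the preservation property:

```python
import numpy as np

theta = 0.7  # any angle; rotation matrices are orthogonal
A = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

print(x @ y)              # x^T y = 1.0, the dot product
print((A @ x) @ (A @ y))  # <Ax, Ay> = 1.0 as well (up to rounding)
```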
The linearity of the inner product means that we only need to show that
$$\langle Ae_i, Ae_j \rangle = \langle e_i, e_j \rangle,$$
where $\{e_1, \dots, e_n\}$ is some basis for $\mathbb{R}^n$. We can clearly choose the standard basis, where $e_i$ has a $1$ in the $i$-th position and $0$s elsewhere, so that $\langle e_i, e_j \rangle = \delta_{ij}$, where $\delta_{ij}$ is the Kronecker delta. We can then prove the result:
If $A^T A = I$, then
$$\langle Ae_i, Ae_j \rangle = (Ae_i)^T (Ae_j) = e_i^T A^T A e_j = e_i^T e_j = \langle e_i, e_j \rangle.$$
Conversely, if $\langle Ax, Ay \rangle = \langle x, y \rangle$ for all $x, y$, then
$$\delta_{ij} = \langle e_i, e_j \rangle = \langle Ae_i, Ae_j \rangle = e_i^T (A^T A) e_j,$$
which is the entry in the $i$-th row and $j$-th column of $A^T A$, which implies that $A^T A = I$.
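The step where $e_i^T (A^T A) e_j$ picks out a single entry is worth seeing concretely; here's a short NumPy sketch of my own demonstrating it with the standard basis:

```python
import numpy as np

M = np.arange(9.0).reshape(3, 3)  # stand-in for A^T A
E = np.eye(3)                     # columns are the standard basis vectors

for i in range(3):
    for j in range(3):
        # e_i^T M e_j extracts the (i, j) entry of M
        assert E[:, i] @ M @ E[:, j] == M[i, j]
print("entry extraction verified")
```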
So the original question can be restated as:

If $a^2 + b^2 = c^2 + d^2 = 1$ and $ac + bd = 0$, show that $a^2 + c^2 = b^2 + d^2 = 1$ and $ab + cd = 0$,

and the result holds because both expressions are equivalent to
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$
being an orthogonal matrix. The result can clearly also be generalised to the $n$-dimensional case, though of course it can't be written down so neatly.
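To see the $n$-dimensional version in action, here's a small NumPy sketch of my own: the QR factorisation of a random matrix produces an orthogonal $Q$, and its rows are orthonormal exactly when its columns are:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# QR factorisation of a random real matrix yields an orthogonal Q
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Rows orthonormal (Q Q^T = I) iff columns orthonormal (Q^T Q = I)
print(np.allclose(Q @ Q.T, np.eye(n)))  # True
print(np.allclose(Q.T @ Q, np.eye(n)))  # True
```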
There's also a similar problem for complex matrices; instead of considering orthogonal matrices, we consider unitary matrices. A matrix $A$ with complex entries is unitary if $A^* A = A A^* = I$; here $A^*$ denotes the conjugate transpose of $A$, obtained by taking the transpose of $A$ and conjugating each entry.
Thus the complex version of question 1. is:

If $|a|^2 + |b|^2 = |c|^2 + |d|^2 = 1$ and $a\bar{c} + b\bar{d} = 0$, show that $|a|^2 + |c|^2 = |b|^2 + |d|^2 = 1$ and $\bar{a}b + \bar{c}d = 0$.
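The same numerical check works in the complex case (again just a sketch of mine): QR of a random complex matrix yields a unitary $Q$, whose entries satisfy both the given conditions and the ones to be shown:

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
Q, _ = np.linalg.qr(Z)  # Q is unitary: Q* Q = Q Q* = I

(a, b), (c, d) = Q
print(abs(a)**2 + abs(b)**2, a * np.conj(c) + b * np.conj(d))  # ~1, ~0 (given)
print(abs(a)**2 + abs(c)**2, np.conj(a) * b + np.conj(c) * d)  # ~1, ~0 (to show)
```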
(Can you tell I'm procrastinating from studying algebraic topology?)