I had a weird thought the other night as I was dozing off.
The product of two matrices A and B (AB) is the matrix whose (i,j)th element is the dot product of the i'th row of matrix A and the j'th column of matrix B.
What if we were to redefine matrix multiplication so that it is always associated with some inner product of two vectors in R^n? That is, suppose we define a product where the (i,j)th element of AB is equal to the inner product P of the i'th row of A and the j'th column of B.
Then the whole study of matrices (at least as far as I know it) is reduced to a mere subset of all possible matrix laws: the subset in which the inner product is chosen to be the dot product.
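A minimal sketch of this idea in Python (the name general_matmul and the diagonal-weight inner product are just illustrative assumptions, not anything standard):

```python
import numpy as np

def general_matmul(A, B, inner):
    """Entry (i, j) of the result is inner(i-th row of A, j-th column of B),
    for an arbitrary choice of inner product instead of the usual dot product."""
    m, p = A.shape[0], B.shape[1]
    C = np.empty((m, p))
    for i in range(m):
        for j in range(p):
            C[i, j] = inner(A[i, :], B[:, j])
    return C

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

dot = lambda u, v: u @ v                # the usual dot product
W = np.diag([1.0, 3.0])
weighted = lambda u, v: u @ W @ v       # a different inner product: <u, v>_W = u^T W v

print(general_matmul(A, B, dot))        # agrees with A @ B
print(general_matmul(A, B, weighted))   # a genuinely different "matrix product"
```

With the weighted choice the result equals A @ W @ B, since ⟨u, v⟩_W = uᵀWv; any inner product on R^n can be written as uᵀGv for a symmetric positive-definite G, so each such redefined product is an ordinary triple product A G B in disguise.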
And now, of course, I'd like to know: has this been thought of already?
A logarithm is just a misspelled algorithm.
Dot product is matrix multiplication. The dot product of the vectors $(u_1, u_2, \ldots, u_n)$ and $(v_1, v_2, \ldots, v_n)$ is the multiplication of the $1 \times n$ matrix

$$\begin{pmatrix} u_1 & u_2 & \cdots & u_n \end{pmatrix}$$

by the $n \times 1$ matrix

$$\begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}.$$
Multiplication of matrices with more rows and columns is a way of doing multiple dot products simultaneously.
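A quick numerical illustration of this (just NumPy, nothing assumed beyond it):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Dot product of the two vectors.
dot = u @ v                     # 32.0

# The same number as a 1×n matrix times an n×1 matrix.
row = u.reshape(1, -1)          # 1×3 matrix
col = v.reshape(-1, 1)          # 3×1 matrix
product = row @ col             # 1×1 matrix [[32.0]]

assert dot == product[0, 0]
```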
Me, or the ugly man, whatever (3,3,6)
Not an answer, but might be interesting.
Let $A$ and $B$ be real matrices. Then
$$(AB)_{ij} = \sum_k a_{ik} b_{kj} = \big\langle R_i(A),\, C_j(B)^T \big\rangle,$$
where $R_i(A)$ denotes the $i$-th row of $A$, $C_j(B)$ denotes the $j$-th column of $B$, and $\langle -, - \rangle$ denotes the standard inner product on $\mathbb{R}^n$. (So indeed, a matrix product is nothing but a bunch of inner products.)
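A quick numerical check of the identity above (a sketch; the shapes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# Build AB entry by entry as <i-th row of A, j-th column of B>.
C = np.array([[np.inner(A[i, :], B[:, j]) for j in range(B.shape[1])]
              for i in range(A.shape[0])])

assert np.allclose(C, A @ B)   # a matrix product is a bunch of inner products
```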
Now suppose $A$ is a real $n \times n$ matrix and $AA^T = I$. Then $(AA^T)_{ij} = \delta_{ij}$. Now notice that $C_i(A^T)^T = R_i(A)$ by definition of $A^T$. Thus $(AA^T)_{ij} = \langle R_i(A), R_j(A) \rangle = \delta_{ij}$. It follows that the rows of $A$ form an orthonormal basis of $\mathbb{R}^n$. This also explains why a square matrix satisfying $AA^T = I$ is called orthogonal.
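For example, checking this with a 2×2 rotation matrix, a standard orthogonal matrix:

```python
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation, so A @ A.T == I

# The rows are pairwise orthonormal: <R_i, R_j> = delta_ij.
G = np.array([[np.inner(A[i], A[j]) for j in range(2)] for i in range(2)])
assert np.allclose(G, np.eye(2))
assert np.allclose(A @ A.T, np.eye(2))
```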
This shows that this way of thinking about matrix multiplication can be interesting. (For example: try to find the analogues for complex matrices).
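A sketch of one guess at the complex analogue, assuming the transpose is replaced by the conjugate transpose and the dot product by the Hermitian inner product (so $AA^* = I$, i.e. $A$ is unitary):

```python
import numpy as np

# Sketch, assuming the complex analogue swaps transpose for conjugate
# transpose and uses the Hermitian inner product <u, v> = sum_k u_k * conj(v_k).
A = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)    # a simple unitary matrix

assert np.allclose(A @ A.conj().T, np.eye(2))    # A A* = I
G = np.array([[np.vdot(A[j], A[i]) for j in range(2)] for i in range(2)])
assert np.allclose(G, np.eye(2))                 # rows are orthonormal
```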