(Summation is assumed)
I think it's normally this
What book?
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
Mathematics of Classical and Quantum Physics by Frederick W. Byron, Jr. and Robert W. Fuller
Oh, I think I know what's going on: he defines
Last edited by Dragonshade (2009-07-14 17:13:38)
The page in question is here.
By "x_j" what the author means is the vector (0, ..., 1, ..., 0), where the 1 is in the jth component. This is of course relative to the basis, so the real vector (not written just as components) is:

x_j = 0*x_1 + ... + 1*x_j + ... + 0*x_n
Hopefully that makes sense. Now multiplying this element by the matrix should reveal the vector:

(a_1j, a_2j, ..., a_nj)
But since we write this in terms of the basis {x_i}, this is really:

a_1j*x_1 + a_2j*x_2 + ... + a_nj*x_n
As the author claims.
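As a quick numerical sanity check (this is my own illustration, not from the book; the 3x3 matrix A and the index j are arbitrary), multiplying a matrix by the component vector (0, ..., 1, ..., 0) does pick out the jth column:

```python
import numpy as np

# Arbitrary 3x3 matrix of components a_ij (values chosen only for illustration).
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# x_j written in components relative to the basis: (0, ..., 1, ..., 0),
# with the 1 in the jth slot (0-indexed here).
j = 1
e_j = np.zeros(3)
e_j[j] = 1.0

col = A @ e_j  # the jth column of A: (a_1j, a_2j, a_3j)
print(col)           # [2. 5. 8.]
print(A[:, j])       # the same column, sliced directly
```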
So this is just a definition. He wrote that vector in terms of the basis {x_i}, with the corresponding column(j) elements. So it could have been row(j) elements, but they are equivalent?
I corrected a typo in my last line, the last term was x_j and it is now x_n.
This is not a definition, it's just using the standard definition of matrix multiplication on the vector (0, ..., 1, ..., 0).
So it could have been row(j) elements, but they are equivalent?
No, whenever doing matrix multiplication (i.e. applying a linear transformation), vectors are always column vectors.
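To see concretely why the two conventions are not equivalent (again my own illustration, with an arbitrary 2x2 matrix): acting on a column vector extracts a column of A, while a row vector times A extracts a row, and those are generally different vectors:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
e_0 = np.array([1.0, 0.0])

# Column convention (the standard one): A @ v extracts column 0 of A.
col = A @ e_0   # [1. 3.]

# Row convention: v @ A would extract row 0 instead -- a different vector.
row = e_0 @ A   # [1. 2.]
```

So writing the components down a column versus along a row changes which entries of A show up; the book's convention (vectors as columns, acted on from the left) is what makes A x_j come out as column j.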
Oh, OK, got it. Later I saw that the author put the eigenvectors in columns instead of rows. Thanks :)
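That column convention is also what numerical libraries follow. For instance, NumPy's eigendecomposition returns the eigenvectors as the columns of a matrix (a small check of my own, using an arbitrary diagonal matrix whose eigenvalues are obvious):

```python
import numpy as np

# Diagonal matrix, so the eigenvalues (2 and 3) are known in advance.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

w, V = np.linalg.eig(A)

# The eigenvectors are the COLUMNS of V: V[:, k] pairs with eigenvalue w[k].
for k in range(len(w)):
    assert np.allclose(A @ V[:, k], w[k] * V[:, k])
```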