Because when I try to solve it, I get the negative of the expression.
I don't like playing with alpha, beta, and gamma, so I'm substituting
alpha = x, beta = y, gamma = z.
Here it is:
You take the right-hand side (RHS) of the expression and then expand it:
RHS= (y-x)(z-x)(y-z)
= (yz-yx-xz+x²)(y-z)
= y²z-xy²-xyz+x²y-yz²+xyz+xz²-x²z
(now rearrange them and combine like terms)
= y²z-yz²+xz²-x²z-xy²+x²y
= -1(-y²z+yz²-xz²+x²z+xy²-x²y)
= -1[(yz²-y²z)-(xz²-x²z)+(xy²-x²y)]
= -1 × (the original expression)

This is what I'm getting, so check your question again.
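Since the two sides are polynomials, the claimed sign flip can be spot-checked numerically: if the expansion above is right, (y-x)(z-x)(y-z) must equal minus the bracketed expression from the last step at every choice of x, y, z. A quick sketch (the function names `rhs` and `bracket` are just labels I picked):

```python
# Spot-check the expansion: (y-x)(z-x)(y-z) should equal
# -[(y*z**2 - y**2*z) - (x*z**2 - x**2*z) + (x*y**2 - x**2*y)]
# for any x, y, z, since both sides are the same polynomial.
def rhs(x, y, z):
    return (y - x) * (z - x) * (y - z)

def bracket(x, y, z):
    return (y*z**2 - y**2*z) - (x*z**2 - x**2*z) + (x*y**2 - x**2*y)

for x, y, z in [(1, 2, 3), (0, 5, -2), (7, -1, 4)]:
    assert rhs(x, y, z) == -bracket(x, y, z)
print("identity holds at all sample points")  # prints if no assert fires
```

A handful of sample points is not a proof, but for a degree-3 identity it catches any slip in the algebra almost immediately.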
Remember, you can do Greek letters by putting a \ in front of the name of the letter.
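For example, the original variables in this problem would be typeset like this:

```latex
% Greek letters: a backslash followed by the letter's name
(\beta - \alpha)(\gamma - \alpha)(\beta - \gamma)
```

Capitalize the name for the uppercase form, e.g. \Gamma for the capital gamma.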
I finally got it, yeah! Now prove this, please.
If you want further reading on LaTeX, or to clear up how to program matrices, then go to this excellent website (which is a must for your 'favorites'):
http://www.maths.tcd.ie/~dwilkins/LaTeXPrimer/
EDIT: 'Matrices and arrays' shouldn't be far down the page
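As a taste of what that page covers, here is one common way to typeset a matrix, using the array environment (a sketch; other environments such as amsmath's pmatrix work too):

```latex
% A 3x3 matrix via the array environment; & separates columns, \\ ends rows
\left( \begin{array}{ccc}
1 & \alpha  & \alpha^2 \\
1 & \beta   & \beta^2  \\
1 & \gamma  & \gamma^2
\end{array} \right)
```

The `ccc` argument says all three columns are centered; use `l` or `r` for left- or right-aligned columns.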