I thought percent difference was the difference divided by the average, multiplied by 100. But one of my professors said that to calculate the % difference you basically compute 100((a - b) / b). I'm doing linear approximation in my math book and it asks for the % difference, and it gives the same formula: 100((approx - exact) / exact). Which is it? I'm confused.
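A quick Python sketch (with made-up values) showing that the two formulas you describe really are different quantities. The first divides by the average of the two values; the second divides by the reference (exact) value:

```python
def percent_difference(a, b):
    # Symmetric version: difference divided by the average of the two values.
    return 100 * abs(a - b) / ((a + b) / 2)

def percent_error(approx, exact):
    # Textbook version: difference relative to the exact (reference) value.
    return 100 * (approx - exact) / exact

# Hypothetical example: approx = 105, exact = 100
print(percent_difference(105, 100))  # 4.878... (denominator is the average, 102.5)
print(percent_error(105, 100))       # 5.0     (denominator is the exact value, 100)
```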
Please see bobbym's signature. It says, "In mathematics, you don't understand things. You just get adjusted to them." That is true for percentages. Percentage change is defined as (final - initial)/initial × 100. If a quantity changes from 100 to 200, that is a 100% increase; but if it then falls from 200 back to 100, that is only a 50% decrease. It is defined this way to reduce heart attacks resulting from crashes in the stock market. Actually, everybody is confused about the definition but is not aware of it.
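A small sketch of the asymmetry described above, using the values from the post:

```python
def percent_change(initial, final):
    # Change measured relative to the starting value, per the definition above.
    return 100 * (final - initial) / initial

print(percent_change(100, 200))  # 100.0 -> a 100% increase
print(percent_change(200, 100))  # -50.0 -> only a 50% decrease
```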
(1) Vasudhaiva Kutumakam. {The whole Universe is a family.}
(2) Yatra naaryasthu poojyanthe Ramanthe tatra Devataha {Gods rejoice at those places where ladies are respected.}