Math Is Fun Forum


#1 2006-10-14 05:46:55

abc4616
Member
Registered: 2006-10-01
Posts: 9

Correlation Question

Given a random variable X with standard deviation x, and a random variable Y = a + bX, where a and b are constants, show that the correlation coefficient between X and Y is -1 if b < 0 and +1 if b > 0.

Does anyone know how to prove this?


#2 2006-10-16 02:46:26

fgarb
Member
Registered: 2006-03-03
Posts: 89

Re: Correlation Question

I really need to learn more statistics myself, but I think this is how you should approach it:

Definition of the correlation coefficient:

[align=center]
\rho_{XY} = \frac{V(X,Y)}{\sigma_X \, \sigma_Y}
[/align]

V is the covariance, and I'm using the sigmas in place of the lower case letters for the standard deviations here - it can get confusing otherwise. Now try to express everything in terms of X. For example, if you write out the definition of covariance you get Mean(XY) - Mean(X)*Mean(Y). You can plug in Y = a + bX here, and you should find that the covariance depends only on b and the variance of X.
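To spell that substitution out (just a sketch of the step described above, writing E[·] for Mean, which isn't notation from the posts):

\begin{align*}
V(X,Y) &= E[XY] - E[X]\,E[Y] \\
&= E[X(a + bX)] - E[X]\,E[a + bX] \\
&= a\,E[X] + b\,E[X^2] - a\,E[X] - b\,E[X]^2 \\
&= b\left(E[X^2] - E[X]^2\right) = b\,\sigma_X^2 .
\end{align*}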

Then express the standard deviation of Y in terms of the standard deviation of X as well, and when you plug everything into the formula all the dependence on X should cancel out - if it doesn't, you know you made a mistake. Also, remember that a standard deviation has to be positive, but a covariance can be negative; that's how you get a negative answer for the case b < 0. Please ask again if that doesn't make sense!
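For completeness, here is how the cancellation should come out (assuming σ_X > 0 and b ≠ 0, otherwise the correlation isn't defined):

\begin{align*}
\sigma_Y &= \sqrt{\operatorname{Var}(a + bX)} = |b|\,\sigma_X, \\
\rho_{XY} &= \frac{V(X,Y)}{\sigma_X\,\sigma_Y} = \frac{b\,\sigma_X^2}{\sigma_X\,|b|\,\sigma_X} = \frac{b}{|b|} =
\begin{cases} +1, & b > 0, \\ -1, & b < 0. \end{cases}
\end{align*}

If you want a quick numerical sanity check, something like this sketch (using numpy; the sample size and the particular values of a and b are arbitrary choices, not from the posts) should print correlations very close to +1 and -1:

[code]
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)            # any X with nonzero spread works

for a, b in [(2.0, 3.0), (2.0, -3.0)]:  # one case with b > 0, one with b < 0
    y = a + b * x
    r = np.corrcoef(x, y)[0, 1]         # sample correlation coefficient of X and Y
    print(f"a={a}, b={b}: correlation = {r:.6f}")
[/code]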

Last edited by fgarb (2006-10-16 02:48:25)

