I have the complex function f(z) = (z - w)/(1 - w'z). Here, w is a fixed complex number on the unit circle and w' is the conjugate of w (so if w = a + bi, then w' = a - bi).
I have to show that if z is in the unit disc, then so is f(z).
This means I have to show that |(z-w)/(1-w'z)| is less than or equal to 1 whenever |z| <= 1, right?
So I went ahead with the statement |(z-w)/(1-w'z)| <= 1 and said that it holds whenever |z-w| <= |1-w'z| holds, because I can "distribute" the norm over any quotient (|u/v| = |u|/|v|) and, because the norm is always nonnegative, I can multiply both sides of the inequality by the norm of the denominator to obtain the second form.
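(Not part of the original post: the "distribute the norm" step rests on the identity |u/v| = |u|/|v| for complex numbers, which a quick numerical sketch in Python can confirm for a few sample values.)

```python
import random

random.seed(1)
for _ in range(5):
    u = complex(random.uniform(-2, 2), random.uniform(-2, 2))
    v = complex(random.uniform(-2, 2), random.uniform(-2, 2))
    if v != 0:
        # |u/v| should equal |u| / |v| up to floating-point error.
        assert abs(abs(u / v) - abs(u) / abs(v)) < 1e-12
```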
But when I substitute z = x + yi and w = a + bi and compute both sides of the inequality, I get at the end that the original statement holds whenever x^2 + y^2 <= x^2 + y^2, which is true for any complex number. This seems to say that the function f takes EVERY complex number to the unit disc. Also, nothing seems to go wrong if I replace the inequality by equality, which says that |f(z)| = 1 whenever x^2 + y^2 = x^2 + y^2. Does this say that f takes every complex number to the unit circle, since the condition holds for every complex number (|z|^2 = |z|^2 always)?
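(Not part of the original post: a quick numerical sanity check of the computation, assuming w lies on the unit circle as stated, i.e. w = e^{i*theta}. Sampling z anywhere in the plane, not just in the disc, lets one see whether |f(z)| really comes out as 1.)

```python
import cmath
import random

def f(z, w):
    """The map from the post: f(z) = (z - w) / (1 - conj(w) * z)."""
    return (z - w) / (1 - w.conjugate() * z)

# A point w on the unit circle: w = e^{i*theta}, so |w| = 1.
w = cmath.exp(1j * 0.7)

# Random points z across the plane (avoiding z = w, where f is 0/0).
random.seed(0)
for _ in range(5):
    z = complex(random.uniform(-3, 3), random.uniform(-3, 3))
    # |f(z)| should be 1 up to floating-point error for every such z.
    print(abs(f(z, w)))
```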
I went over the computation a couple of times, but found no errors. Still, I find the result very odd. Can someone tell me whether I screwed something up in my process? I feel like I'm committing some grievous algebraic sin or something, but cannot figure out what it is.
Last edited by almost there (2010-04-05 05:35:01)