Given this theorem:
Let a, b ∈ Z, not both zero. Then a greatest common divisor d of a and b exists and is unique. Moreover, there exist integers x and y such that d = ax + by.
I need to prove the uniqueness part of the theorem. I figured out the proof of existence, but I have no idea how to prove that the gcd of a and b is unique.
After that, I also need to prove the following:
Let a, b ∈ Z, not both zero. Let S = {n ∈ Z | n = ax + by for some x, y ∈ Z}. Let d = (a, b), where (a, b) is the gcd of a and b. Prove that S is the set of all integer multiples of d.
I used the set S in the previous part to prove the first part of the theorem, and I'm guessing I need to use the theorem and manipulate it somehow to prove that S is the set of all integer multiples of d.
Can anyone help?
To prove it's unique, let d = gcd(a, b), and then suppose there is another gcd, e. It should be fairly easy from there, because by definition a gcd is the greatest of the common divisors of a and b. Can you have two greatests? (See the sketch below.)
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
For the second one, use the fact that if d = gcd(a, b) and n = ax + by for some x, y ∈ Z, then d | n. It should be straightforward from there (see the sketch below).
And Kazy, what math class are you in? You've been asking for proofs from abstract algebra, set theory, number theory, and probably other areas I've forgotten. I'm just kind of curious.
Last edited by Ricky (2006-04-04 13:59:18)
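A quick sketch of that fact, assuming only that d is a common divisor of a and b:

Let n ∈ S, so n = ax + by for some x, y ∈ Z.
Since d | a and d | b, we can write a = ds and b = dt for some s, t ∈ Z.
Then n = dsx + dty = d(sx + ty), so d | n.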
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
Thanks! The book is called "Introduction to Abstract Mathematics", and the class is titled "Intro to Logic for Secondary Mathematics". Thanks for all your help. I took this class as an elective thinking it would be easier than most of the other upper-level math classes, but it's turned into a real pain in the math.
For the 2nd part, I see that d | n, but I don't see how that shows S is the set of all integer multiples of d.
If d | n, then n is a multiple of d, so every n in S is a multiple of d. For the other direction, you need that every multiple of d is in S as well.
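To round it off, here is a sketch of that reverse inclusion (x0 and y0 are just names for the integers the theorem provides; your book may call them something else):

By the theorem, d = ax0 + by0 for some x0, y0 ∈ Z.
Then for any k ∈ Z, kd = a(kx0) + b(ky0), which has the form ax + by, so kd ∈ S.
Combined with d | n for every n ∈ S, this shows S is exactly the set of all integer multiples of d.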
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."