It seems that there are several of us here who would like, from time to time, to discuss stuff which is outside the range of interests of those who come looking for elementary help in math. Any chance of a new section with a sufficiently enticing title (and a reassuring sub-title.... to the effect that it ain't as scary as it sounds)?
Yes, but these are the same number
(yes, I've realized that now, sorry!)
They're just presented differently: 4, or 7 - 3, or 3.999... But if a number like 3 actually equalled a different number like 4, you'd have big problems!
Hey, don't be sorry. This is still a deep discussion in the philosophy of math! In what sense is 2 + 3 the same as 4 + 1? They share only one element, the = sign. In philosophy, it is by no means a trivial question whether 0.99... = 1 or not.
In math, we just define it thus, and move on to the really good stuff
The proof that inverses are identically left and right is a bit more tedious, but equally trivial. If anyone wants a crack, remember that (a^-1)^-1 = a.
Well you guys are no fun! Here.
Let a^-1 be a right inverse, so that aa^-1 = e, and we know that ae = a. Note first that, since (a^-1)^-1 = a,

a^-1 a = a^-1 (a^-1)^-1

This says the last term is a right inverse - of a^-1 this time. But xx^-1 = e by assumption, so I have

a^-1 a = a^-1 (a^-1)^-1 = e

TraLa!
It's the left v right that seems confusing, BTW. I've never seen them used in such context.
Yes, it can be confusing. Just remember this. Obviously, 1 + 2 = 2 + 1, it's called commutativity. But there are plenty, plenty situations where xy does not equal yx. So every time you see xy = z you may not simply assume that yx =z. This would imply that xy = yx, and you gotta prove it! That's all I meant by "left and right" identity, if ae = a, I have to prove that ea = a.
Go on! Try it with the inverses, it's easy, just inch forward.
Sorry ben. In an attempt to write a quick post, I was too brief.
No problem, math is fun, right?
Consider the equation:
But why should I? Where did it come from, other than the Taylor series I showed?
And so it must be that:
Why? It could be, by your reasoning, that
Anyway, let's not fall out in public. Others seem to have lost interest anyway.
I need more math classes!
Well, you're in the right place then! But what I did was even easier than falling off a log. Look.
Group theory is an element of Abstract Algebra (so called), where we are not allowed to take our intuition about real numbers for granted, so everything has to be proved from the ground up, so to speak.
Here's a secret. It ain't that hard.
Shouldn't the highlighted equation read: "a + e = e + a = a" ?
--All Is Number
Yes you are right. Why didn't I spot that? However, we may not assume that, if e is a right (resp. left) identity then it is also a left (resp. right) identity.
However, the proof is trivial. Let ab = c and ae = a. Then, because we insist on associativity of the group operation, we have that (ae)b = a(eb) = c, which implies that eb = b, so e is also a left identity.
The proof that inverses are identically left and right is a bit more tedious, but equally trivial. If anyone wants a crack, remember that (a^-1)^-1 = a.
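If anyone would rather see it than prove it, here's a brute-force sketch in Python. I'm using Z_4 under addition mod 4 as a hypothetical concrete group; the point is just to watch the right identity and right inverses come out two-sided.

```python
# Sketch: in Z_4 under addition mod 4, check that the right identity
# and right inverses are automatically left ones too.
n = 4
elems = range(n)
op = lambda a, b: (a + b) % n

e = 0  # identity candidate
assert all(op(a, e) == a for a in elems)       # ae = a: right identity
assert all(op(e, a) == a for a in elems)       # ea = a: also a left identity

inv = {a: (-a) % n for a in elems}             # right inverses
assert all(op(a, inv[a]) == e for a in elems)  # a a^-1 = e
assert all(op(inv[a], a) == e for a in elems)  # a^-1 a = e: also left inverses
print("identity and inverses are two-sided in Z_4")
```

Of course, a check in one group proves nothing in general - that's what the proof above is for.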
Wouldn't a line parallel to the z-axis still cast one tiny dot on the x-y plane, wherever the (x,y) of (x,y,z) is?
Yup! Good work. And for a vector z pointing straight up the z-axis, that "tiny dot" sits at the origin: the projection of z on both X and Y is zero. So you see that if the projection of z on both x and y is zero, then z is perpendicular to both.
Now, as we like to invent words to make ourselves look clever, we say that z is orthogonal to x and y (actually I'm kidding. If I told you the real reason we don't use "perpendicular", it would scare you witless!)
Wanna see it? I mentioned the "dot product" (I hate that term). A lot of texts use z·x to mean the projection of z along x (hence the dot in dot product). And if z·x = 0, as you rightly guessed, z and x are mutually perpendicular or, harrumph, orthogonal.
Whence the even famouser

e^(i*pi) + 1 = 0

The 5 fundamental constants of math - e, i, pi, 1 and 0 - all in bed together (but no action!!)
Not a plane intersecting the 3-D graph, but more of a shadow of the 3-D graph onto the x-y plane, so
essentially Z just goes away and the x,y points are still valid.
It would be like looking at the 3-D graph along many lines parallel to the z-axis.
Your intuition was right, the construction you are referring to is included in the study of vectors.
But lookee. You want to look at the projection of a 3-vector onto 2-space (that's just me being fancy with your own words). Again, you guessed right - the image of a point, or set of points, or vector, whatever from a higher to a lower dimension is called a projection.
Lookee further. Having projected your 3-vector onto the XY plane, what do you do with this "shadow"? Well personally I would evaluate it in terms of the X and Y axes. So effectively you are asking - what is the projection of your 3-vector along each of the X, Y and Z axes. This is an interesting construction that goes by the name of inner (or scalar, or dot) product.
Here's something to think about. Suppose you have a vector that casts no shadow on either the X or Y axes? What would you say about that? Here's a hint: on a bright sunny day, under what circumstances can you not see your own shadow?
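To make the shadow talk concrete, here's a tiny sketch in plain Python (the vectors are hypothetical examples): a vector straight up the z-axis has zero dot product with the X and Y unit vectors, i.e. it casts no shadow on either axis.

```python
# Sketch: projections ("shadows") via the dot (inner) product.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x_hat = (1, 0, 0)
y_hat = (0, 1, 0)
v = (0, 0, 5)        # a vector parallel to the z-axis

# v casts no shadow on the X or Y axes...
assert dot(v, x_hat) == 0 and dot(v, y_hat) == 0
# ...so it is perpendicular (harrumph, orthogonal) to both.

w = (3, 4, 5)
print("shadow of w on X:", dot(w, x_hat))  # 3
print("shadow of w on Y:", dot(w, y_hat))  # 4
```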
The ideal <x,y> consists of the polynomials without constant term. In particular, it is not all of F[x,y]. So, if <x,y> = <g> for some g in F[x,y], then g is not a unit. On the other hand, g is a greatest common divisor of x and y. But then g = ax and g = by for some a, b in F^*, a contradiction.
Umm. Maybe we should agree what we mean by an ideal in general terms? Like in any ring?
But you're being really naughty here. What, for the love of all that's holy, is F*? Where do a and b come from? What do you mean by a unit? You really must define your terms. I'm not just being nit-picky here, I truly have no idea what you're talking about.
But note that 0,333... is not 1/3, it's an approximation.
No, it's the best our notation can render. Everyone except you understands that. The onus is on you to show which is an approximation. 1/3 or 0.33.... Mathematically, that is.
It's just slightly less than 1/3.
Watch your logic here. If you want to argue that 1/3 is not equal to 0.33..., you are not allowed to use that as an assumption.
It approaches 1/3 like a limit, but never becomes it.
It has absolutely nothing, nothing to do with limits.
Anyway, I'm running away from this thread. Things like this get me cross, and I have no wish to fall out with anyone here.
Yes,
OK, I see I need to build from the bottom. I am happy to try that.
Ricky, George - waddya think? Where do I start? Sets? Define abstract algebra? Anyway, this is to Ricky:
Prove that <x, y> = {xg + yh : g, h in F[x, y]}, contained in F[x, y], is not a principal ideal (and hence that F[x, y] is not a principal ideal domain).
And I'm not entirely sure if this is valid:
Assume that (xg + yh) generates <x, y> for some polynomials g, h
in F[x, y]. Then there must exist a polynomial f such that (xg + yh)f =
x, since x is in <x, y>. So it must be the case that h = 0,
otherwise you would have a y term. The same reasoning stands that g = 0
since we need to generate y. But if g and h are 0, then you generate
nothing but 0. Contradiction.
What do you think? Valid?
I hope you have solved this because I'm really rusty on this, so I'm
sort of feeling my way. But I suspect
you need to take advantage of the fact that every PID is also a UFD.
But I'm a bit confused by your notation. The way I learned it, F[x,y]
means a polynomial ring in 2 variables, x and y, with coefficients
taken from the ring F. But you have <x,y> generated by {xg + yh},
g and h in F[x, y]. I don't quite see what's going on here. Please tell.
Anyway, this is as close as I can get. Let F be a field and F[x, y] be the ring of polynomials over F. Let <x,y> be generated by (xg + yh).
The ideal I is generated by (xh + yg), with f(xh + yg) = x and k(xh + yg) = y for some f, k in F[x, y]. Since no element of F[x, y] divides both x and y except for elements of the field F, and elements of the field are not contained in the ideal, the ideal cannot be principal.
And I tell you now, I am not in the slightest convinced by that. What did you you finally get?
Who was the smart guy who proved you couldn't have something on both ends of an infinite number of things, ben?
Dunno, Cantor maybe? He was interested in different sorts of infinity (!). He also went bonkers......
I've heard you can't make a set that way
You mean you can't have an infinite set? Of course you can
For me it sounds like the number that lies between those numbers is 10^(-∞)
I realize this was tongue in cheek, but it does show a common difficulty. Infinity cannot be treated like "just some hugely big number". It is an abstract concept.
Like, infinity is what there is when you run out of counting numbers. Huh? You simply cannot manipulate it as though it were an element of R. So 10^(-∞) has no meaning. Sorry to appear tough!
(Actually, there is a real line, called, I think, the projective real line, or something like that, which does precisely that. Try googling, as I know FA about it)
Is (...) supposed to mean limit? Now that's new.
Besides, I think this would be a valid argument against it:
Certainly not! Here's your homework.
Is 1/3 rational or not?
Is 1/3 + 1/3 +1/3 equal to 1?
Is 1 rational or not?
What is the decimal representation of 1/3?
Of 1/3 + 1/3 + 1/3?
See where it goes?
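If you want to check that homework with exact rational arithmetic rather than decimals, Python's fractions module does it in a few lines (a sketch, nothing more):

```python
from fractions import Fraction

third = Fraction(1, 3)               # exactly 1/3: a rational number
assert third + third + third == 1    # and three of them make exactly 1

# A finite decimal like 0.333 is NOT 1/3 -- it's an approximation:
assert Fraction(333, 1000) != third
print(third, "* 3 =", third * 3)
```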
No! Please don't write 0.0.......1, it has no meaning. Rather write 3.9..... or 3.99....... etc. The ellipses imply an infinite number of nines. As we really don't have time to write out 9 an infinite number of times, by convention we call that number 4.
Or, to be cute. If 3.99.... is not equal to 4, you have two choices. Either tell us what lies between these two numbers, or you assert that the real line is not continuous. I promise you, either option will result in a very serious headache.
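For the record, the "by convention" above can be cashed out with a geometric series: the ellipsis means an infinite sum, and that sum is exactly 1 more than 3, not approximately.

```latex
3.99\ldots \;=\; 3 + \sum_{k=1}^{\infty} \frac{9}{10^{k}}
        \;=\; 3 + 9 \cdot \frac{1/10}{1 - 1/10}
        \;=\; 3 + 9 \cdot \frac{1}{9}
        \;=\; 4.
```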
Well,
first, I hadn't intended my question as a challenge - I wanted to know for myself. I've got it figured now.
Second, Ricky, I can't see what you did there. You assumed the result I
wanted to see shown in order to find what I wasn't looking for! Or have
I missed something?
Anyway, with the help of some textbook, I see it.
Note first that z = x + iy here. Note also that e^z is analytic
throughout the complex plane: it is an entire function. We expect there
to be a Taylor series, therefore.
Recall that the Taylor series for the circular trigs are

cos z = 1 - z^2/2! + z^4/4! - ...

sin z = z - z^3/3! + z^5/5! - ...

We now want to look at e^(iz). Notice that the powers of i cycle: the even powers are alternately -1 and 1, the odd powers alternately i and -i.

Remembering that

e^x = 1 + x + x^2/2! + x^3/3! + ...

all I need do now is remind you that the expansion for e^-x is

e^-x = 1 - x + x^2/2! - x^3/3! + ...

and you should easily see (by comparing the appropriate Taylor series) that

cos z = (e^(iz) + e^(-iz))/2 and sin z = (e^(iz) - e^(-iz))/(2i)
And as we know e^z is an entire function, we can assume the same is true here.
Now that's something that really does deserve to be called cool!
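For anyone who'd like a numerical sanity check of these exponential/trig identities rather than comparing Taylor series, here's a quick sketch using Python's cmath (the sample points are arbitrary):

```python
import cmath

# Check cos z = (e^(iz) + e^(-iz))/2 and sin z = (e^(iz) - e^(-iz))/(2i)
# at a few arbitrary points, including a genuinely complex z = x + iy.
for z in (0.7, 2.5 + 0.0j, 1.0 + 2.0j):
    cos_z = (cmath.exp(1j * z) + cmath.exp(-1j * z)) / 2
    sin_z = (cmath.exp(1j * z) - cmath.exp(-1j * z)) / (2j)
    assert abs(cos_z - cmath.cos(z)) < 1e-12
    assert abs(sin_z - cmath.sin(z)) < 1e-12

# And the famous special case: e^(i*pi) + 1 = 0
assert abs(cmath.exp(1j * cmath.pi) + 1) < 1e-12
print("the identities check out")
```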
So,

cos z = (e^(iz) + e^(-iz))/2

and

sin z = (e^(iz) - e^(-iz))/(2i)

Why? Anyone know? z = x + iy, by the way
Ricky, as you know I'm new here, so I may have mistaken the level of my pitch. Overpitched, you think? Your intro to symmetric groups was a lot more elementary than mine (nonetheless quite correct). I'm sure other members are grateful for that.
Nevertheless, I shall continue on my course. Remember, I am quite happy to expand, explain, illustrate and generally dance the fandango.
OK, if anyone is still awake, I have a lot more to say about subgroups, but first I think I need to do this.
When thinking about group theory it is useful to return to our favourite number systems: R (the reals), N (the natural numbers) and Z (the integers). But this can be misleading in lots of the things I now want to say. For example, a lot of stuff which seems self-evident in R, Z and N actually may not apply to other groups (N, in fact, isn't even a group under addition - no inverses!). So, to give a taste of a quite different sort of group, I will introduce you to an important and versatile one.
Consider the set {1, 2, 3}. Now consider the set {1, 3, 2}. It's not too hard to see what I've done, so let's write that exchange as (2, 3), i.e. exchange 2 with 3 and 3 with 2 (keep a close eye on the way brackets are used here!).

That operation is referred to as a permutation, so let's try to figure out how many permutations there might be on the set {1, 2, 3}.......pause for thought......That's right, for each set with n elements, there are n! permutations.
What we have here, then, is a set with 3 elements whose permutations are themselves the elements of a 6 element group called the symmetric group S_3. In fact, with a bit of practice, it's not too hard to do these sort of permutations in one's head - basically this is matrix algebra. It is easy to see that this is a non-abelian group, as are all S_n for n > 2. The connection with matrix algebra is really nice, which I'd be more than happy to expand upon (if asked).
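Here's a throwaway sketch of S_3 in Python, representing a permutation as a tuple p where p[i] says where i is sent (I'm using {0, 1, 2} rather than {1, 2, 3} because Python counts from 0). It shows there really are 3! = 6 permutations and that the group is non-abelian:

```python
from itertools import permutations

# All permutations of {0, 1, 2}; p[i] tells you where i is sent.
S3 = list(permutations(range(3)))
assert len(S3) == 6            # 3! = 6 elements

def compose(p, q):
    """Apply q first, then p."""
    return tuple(p[q[i]] for i in range(3))

a = (1, 0, 2)   # swap 0 and 1
b = (0, 2, 1)   # swap 1 and 2
print(compose(a, b), compose(b, a))   # two different permutations!
assert compose(a, b) != compose(b, a) # so S_3 is non-abelian
```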
OK, to return to subgroups. If, for some G with subgroup H we have that ghg^-1 is in H, for all g in G, all h in H, H is said to be a normal (sometimes called invariant) subgroup. Note that, as this must be true for all h in H, this condition is often written gHg^-1 = H.
Also note that every group G has at least two normal subgroups - G itself and the trivial subgroup {e}! Where these are the only two, G is said to be a simple group. Note also that gHg^-1 = H implies that gH = Hg. So here's another definition: gH and Hg are called the left (resp. right) cosets of H; they are only interesting when g is not in H (otherwise gH is just H). This is an incredibly important construction, so let's see a simple example.

Consider the group 3Z, those integers exactly divisible by 3. This is an abelian subgroup of Z, so it has 0 as the identity e and addition as the operation. The left cosets of 3Z are 1 + 3Z, 2 + 3Z, 4 + 3Z, 5 + 3Z etc. Or are they? Elements of 3Z are ...-3, 0, 3, 6 etc. What's the difference between 1 + 6 and 4 + 3? This brings us to something really, really important.
I invite you to look at these cosets. Notice anything? What's the relation between 1, 4, 7.... on the one hand and 2, 5, 8.... on the other? Yup, within each list they differ by an element of 3Z. This defines what is called an equivalence relation. To formalize: given a normal subgroup A of a group G, "gh^-1 is in A" defines an equivalence relation ~ on G iff the relation is
reflexive: g ~ g
symmetric: g ~ h ==> h ~ g
transitive: g ~ h and h ~ k ==> g ~ k
for all g, h, k in G.
This is an example of a universal construction, and is highly important. This post is already over long. More later, if anybody wants.
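You can watch the cosets of 3Z collapse into just three equivalence classes with a quick sketch (the finite window of integers is an arbitrary choice, since I can't print all of Z):

```python
# Sketch: cosets of 3Z inside Z, over a finite window of integers.
window = range(-9, 10)

cosets = {}
for g in window:
    cosets.setdefault(g % 3, []).append(g)   # g ~ h iff g - h is in 3Z

# Only three distinct cosets: 3Z, 1 + 3Z and 2 + 3Z.
assert sorted(cosets) == [0, 1, 2]
# 1 + 3Z and 4 + 3Z are the SAME coset, since 4 - 1 is in 3Z:
assert 4 % 3 == 1 % 3
for r, members in sorted(cosets.items()):
    print(r, "+ 3Z contains", members)
```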
After Ricky's nice concrete introduction to groups, let me burden you all with a degree of abstraction. First let's have a definition:
A group G is a set which has
a well-defined, associative and closed operation
an identity e
an inverse for each element.
Ricky covered all this, so let's unpack it a little. In group theory, the group operation is always referred to as "group multiplication". But don't be fooled: group theorists don't always mean multiplication in the arithmetic sense; it might mean addition (they mean something like "a multiplicity of group elements"). There is a convention which I'll explain later.
The operation, group multiplication, is said to be "closed" if the result of the operation is still an element of the group (are the primes then a group?).
The identity e is as Ricky described: 0 when the operation is addition, 1 when it is multiplication.
The inverse is again as Ricky said, respectively -x and 1/x.
Now some notational conventions. Some authors use the centre dot to denote the group operation thus a·b, but most use juxtaposition thus ab. Don't assume this means arithmetic product, though, we might be in an additive group.
Similarly most authors use x^-1 for the inverse, to be interpreted according to the operation in question
With this and the previous posts in mind, we see that if a, b are in G, ab = c implies c is in G
We also see that ae = a, and aa^-1 = e. Note that these last equalities refer to what are called the right identity and inverse, respectively. The left identity and inverse follow from the group axioms. (Anybody want to try the simple proof?)
Now, perhaps the most important distinction we can make between different sorts of groups is this: if the operation on the elements h, g of a group G is commutative (i.e. gh = hg) the group is said to be abelian, otherwise non-abelian. I referred earlier to a notational convention: for abelian groups the operation is conventionally written as addition.
Now non-commutativity need not throw us into a panic - think matrix multiplication, for example - but the notation can be misleading at first. ab = c need not imply that ba = c, and note that ab = c gives a = cb^-1, but not (in general) a = b^-1c; that follows from my axioms above.
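Since matrix multiplication came up: here's a quick check in plain Python (no libraries, matrices picked arbitrarily) that 2×2 matrices really don't commute in general:

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
print(matmul(A, B))   # [[2, 1], [1, 1]]
print(matmul(B, A))   # [[1, 1], [1, 2]]
assert matmul(A, B) != matmul(B, A)   # AB = C does NOT force BA = C
```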
One more word on commutativity: it, or the lack of it, is a property of the group elements, not of the operation (associativity would not follow otherwise).
Now there is another, even more abstract, way to describe a group, which I think is super cool: a group G is a collection of sets, these being the underlying set, written |G|, the Cartesian product |G|×|G| (this is where our operation lives, as a map |G|×|G| → |G|) and an axiom set (the set of rules that tell us about identity, inverse etc.)
Roughly speaking, the order of a group G is the number of elements in the underlying set |G|. There's an interesting theorem of my buddy Lagrange, but first I need to tell you this.
Consider the groups G and H. If every element of H is also an element of G, then clearly G = H iff every element of G is also an element of H. But if there are elements in G not in H, we say H is a (proper) subgroup of G. As H is a group in its own right it must share the identity e with G, must contain an inverse for each of its elements and must be closed under the group operation. Evidently the operation on G and H must be the same.
OK. Lagrange says that the order of any subgroup H of G divides the order of G. This is less easy to prove than it looks, but it's a really cool result.
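A quick empirical taste of Lagrange (a sketch, not a proof!): take Z_12 under addition mod 12, a hypothetical concrete choice, and check that the cyclic subgroup generated by each element has order dividing 12.

```python
n = 12  # the group Z_12 under addition mod 12

def cyclic_subgroup(g):
    """The subgroup generated by g: keep adding g until we return to 0."""
    H, x = {0}, g
    while x != 0:
        H.add(x)
        x = (x + g) % n
    return H

for g in range(n):
    H = cyclic_subgroup(g)
    assert n % len(H) == 0   # Lagrange: |H| divides |G|
    print("g =", g, "generates a subgroup of order", len(H))
```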
I'm going to close (for now) with some examples of groups and things which aren't groups:
Z, the integers, is an additive abelian group
The even integers are an additive abelian group; the odds are not a group, neither are the primes
S_n, the set of permutations on n objects is a group, non-abelian for n > 2
The set of rotations in 3-space is a non-abelian group.
Tired of typing, more another time if anyone wants (we've scarcely started, it just keeps getting better!) But I am happy to answer questions, even happier to be corrected!
Hi Ricky, yes groups are fun things. I see you have a thread on the subject. May I barge in there?
Well, hello all. I'm interested in set theory, group theory, vector spaces, topology.....and all that stuff. Looking forward to some really good conversations here