OK, please just put the monotone convergence proof in, to make it complete.
X'(y-Xβ)=0
Sure, George. This is known as the Monotone Convergence Theorem:
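In outline, it goes like this (a reconstruction of the standard argument, using the greatest lower bound L and the ε that the posts below refer to):

Theorem (Monotone Convergence). If a sequence (a_n) is monotonically decreasing and bounded below, then it converges; in fact a_n \to L, where L = \inf_n a_n.

Proof sketch. L exists by the completeness of the reals. Fix \varepsilon > 0. Since L + \varepsilon > L, it is not a lower bound, so some term satisfies a_N < L + \varepsilon. Because the sequence is decreasing, every n \ge N gives L \le a_n \le a_N < L + \varepsilon, i.e. |a_n - L| < \varepsilon. Hence a_n \to L. The increasing, bounded-above case is symmetric, using the least upper bound.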
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
Don't get too math with your proofs. They don't prove anything in my book.
igloo myrtilles fourmis
The word "math" in my last post was cocky, but don't take it the wrong way; I am really impressed with the induction proof, and I intend to learn more about the process in the future. But I am concerned that the n+1 and n+2 are not discrete values; they are just a little bit bigger. I didn't know you could do that.
igloo myrtilles fourmis
Ricky, I messed up a bit in my last post; I was thinking one thing but typed another. Let me try again.
Let's say the limit of f(x)^x as x approaches infinity is 0.5, and 0 < f(x) < 1 for all x.
Now how about the limit of cos(x)^n as n approaches infinity? If at some point cos(x) equaled f(n), then the limit of cos(x)^n at that point x would be 0.5. So would cos(x) equal f(n) at some point?
Well, f(n) is somewhere between 1 and 0, and cos(x) makes its way from -1 to 1, so it's bound to cross it; there must be some point x such that cos(x) = f(n), and we will have a midpoint between 1 and 0.
Also remember, when I say some point where cos(x) = f(n), I'm not speaking of any finite value of n; I'm speaking of n as the "n" in the limit of f(n)^n as n approaches infinity.
It's kind of hard to explain, but hopefully you get my point. If my reasoning is flawed, please tell me where.
Last edited by mikau (2006-07-15 13:26:17)
A logarithm is just a misspelled algorithm.
Let's say the limit of f(x)^x as x approaches infinity is 0.5, and 0 < f(x) < 1 for all x.
You have just completely changed the problem. What you posted above has absolutely nothing in common with a^n as n approaches infinity.
You see, a doesn't vary as n gets higher, but f(x) does vary as x gets higher. In the same way that a is fixed, cos(x) stays exactly the same as n grows when you take cos(x)^n.
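To see the contrast concretely, here are two standard computations (added for illustration; they are not claims from the posts above). A base that is allowed to vary with n can certainly give 0.5: take f(n) = 2^{-1/n}, which lies strictly between 0 and 1, and then f(n)^n = 1/2 for every single n. But for any fixed x with |\cos x| < 1, the base is frozen while the exponent grows, so \lim_{n\to\infty} \cos(x)^n = 0.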
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
Is not infinity considered to be a constant even though it is limitless in size? If so, in the limit of cos(1/n)^n, isn't cos(1/n) still equal to some constant a? We kind of visualize infinity as increasing, but isn't it still considered a constant?
Your wording seems to suggest that cos(x) will reach some finite value and stop, while the power n will continue growing to an infinite value and eventually dominate, reducing the whole thing to be infinitely close to zero. But isn't it possible for a fixed constant value to be 1 - 1/infinity?
A logarithm is just a misspelled algorithm.
Is not infinity considered to be a constant even though it is limitless in size? If so, in the limit of cos(1/n)^n, isn't cos(1/n) still equal to some constant a?
No, because:
(1/n)^n and (1/a)^n are two extremely different problems: a doesn't change as n changes, but n, of course, changes as n changes. The same applies to cos(1/n)^n and cos(1/a)^n.
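A worked comparison (standard limits, added for illustration): for any fixed a with 0 < |\cos(1/a)| < 1, the base is a constant strictly inside (-1, 1), so \lim_{n\to\infty} \cos(1/a)^n = 0. But when the base moves with n, the expansion \cos(1/n) = 1 - \frac{1}{2n^2} + O(n^{-4}) gives n \ln \cos(1/n) \to 0, and therefore \lim_{n\to\infty} \cos(1/n)^n = 1. Two same-looking expressions, two different answers.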
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
But still, the word "changes". Ugh... I just can't seem to translate my thoughts into words...
Take the limit of (1 + 1/n)^n as n approaches infinity. Now we like to think of n as sort of like a train, getting farther and farther and farther away, and, as you said, 1/n "changing" in relation to the power, and as it continues moving, it converges to what we know as "e". But isn't the reality that n is not really moving at all? Hmmm... come to think of it, maybe this is where my confusion stems from. Perhaps the term "approaching" sets it in motion and gives it a value that is ever increasing: a sort of expanding infinity rather than a constant infinity?
I might be thinking in an unconventional fashion, or using limits when I should be using something else... Say we took a constant that was infinite in size but not expanding; its size remains constant. Call it n. Let a = 2n and b = n; then, by my definition, a/b = 2. Or let a = n and b = n^2; then a/b = 1/n and tends to zero. Or let a = 1 + 1/n and b = n; then a^b tends to e.
This is kind of a strange way of using limits, because I'm sort of stashing away infinite or infinitesimal values in constants, then pulling them out and evaluating them like limits. Like saying a = 1/n, b = n, then finding that a*b tends to 1. I'm giving a constant n an infinite or infinitesimal value that is still constant, and then finding what expressions in it tend to be.
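For comparison, read as ordinary limits (the conventional interpretation), the same examples come out exactly as described: \lim_{n\to\infty} \frac{2n}{n} = 2, \quad \lim_{n\to\infty} \frac{n}{n^2} = \lim_{n\to\infty} \frac{1}{n} = 0, \quad \lim_{n\to\infty} \left(1 + \frac{1}{n}\right)^n = e, \quad \lim_{n\to\infty} \frac{1}{n} \cdot n = 1.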
Last edited by mikau (2006-07-15 16:20:35)
A logarithm is just a misspelled algorithm.
If you are going to treat infinity as a constant, then you aren't dealing with the real numbers. Infinity is not in the real numbers. But just like you can use the rationals to approach the square root of 2 (an irrational number, i.e. not in the rationals), you can use real numbers to approach infinity.
But you can never set a real number equal to infinity, because then it isn't a real number. It's the equivalent of saying that n is a natural number and n = -5.3.
Here is a good example of where that line of thinking can get you:
Limit of a/n as n approaches infinity, where a is a real number.
If I let a = n, as you do above, then it's the limit of n/n, and so it's 1. But if I let a = 2n, which by your reasoning is also valid, then the limit is 2. I could keep going, so there are infinitely many different limits for the same problem? Surely that's absurd.
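In symbols, just to spell the point out: for any fixed real a, \lim_{n\to\infty} a/n = 0. Substituting a = n or a = 2n does not give new answers to that problem; it creates new problems, \lim_{n\to\infty} n/n = 1 and \lim_{n\to\infty} 2n/n = 2.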
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
(Sorry to interrupt, but this seems like the *style* of discussion the ancient Greeks might have had, or perhaps a group of philosophers in a Parisian café. It's great.)
"The physicists defer only to mathematicians, and the mathematicians defer only to God ..." - Leon M. Lederman
Very good proof, Ricky. The main point is to admit a greatest lower bound, then translate its property into the N-ε doctrine. I think I'm getting a little bit of a clue about why the continuous reals are necessary for an ideal math world.
By the way, what does s mean? You can simply say that you can find a term less than L + ε, since L + ε is not a lower bound.
_________________________________________
To mikau
Treating infinity and infinitesimals as numbers was first proposed by the mathematician Thomson, and it is still controversial, because rules and properties have to be altered to welcome them into the numbers. Some mathematicians just say this step is unnecessary.
__________________________________________
If I let a = n, as you do above, then it's the limit of n/n, and so it's 1. But if I let a = 2n, which by your reasoning is also valid, then the limit is 2. I could keep going, so there are infinitely many different limits for the same problem? Surely that's absurd.
-- That's what I've insisted, and what you've admitted: a reached state of infinity is artificial and can't exist without a moving cause.
X'(y-Xβ)=0
By the way, what does s mean? You can simply say that you can find a term less than L + ε, since L + ε is not a lower bound.
Let s be a typo. Then it stands that s should correctly be replaced with L. QED
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
If you are going to treat infinity as a constant, then you aren't dealing with the real numbers. Infinity is not in the real numbers. But just like you can use the rationals to approach the square root of 2 (an irrational number, i.e. not in the rationals), you can use real numbers to approach infinity.
But you can never set a real number equal to infinity, because then it isn't a real number. It's the equivalent of saying that n is a natural number and n = -5.3.
Here is a good example of where that line of thinking can get you:
Limit of a/n as n approaches infinity, where a is a real number.
If I let a = n, as you do above, then it's the limit of n/n, and so it's 1. But if I let a = 2n, which by your reasoning is also valid, then the limit is 2. I could keep going, so there are infinitely many different limits for the same problem? Surely that's absurd.
It may seem absurd after getting comfortable with limits, but to me, if first we say a = n and a minute later we say a = 2n, then we've changed the problem completely, and we are getting two different answers for two different problems. Not two for one.
Perhaps it is fundamentally incorrect to let a constant be equal to infinity, if all real values must be finite, but it is, in a way, rather interesting to think about, at least in my opinion. It may be unconventional, but perhaps that's what would make it interesting.
Another possible solution is to say something like "int &a = infinity;", a reference to infinity! Or perhaps define a new type of number class, neither real nor imaginary, for storing infinite or infinitesimal values. For the sake of the discussion, let's call them Elites. An Elite number would be a number a or b such that you could say a = 1 + 1/infinity and b = infinity, and state that a^b = e, or tends to e. You would have to define a fixed Elite value of infinity, say n. When we work with limits, we can take the limit of n/2n as n approaches infinity; in this case you would define n to be some infinite value, and 2n to be an infinite value that's twice as big as infinity! Sort of a paradox, but it still works. It kind of lets you break up a limit and store its parts, then put 'em back together and evaluate.
So if n is defined as a constant infinity as I said, take a/n; can we find what the value tends to be? Sure! But we must be told precisely what kind of number a is. Is it an Elite value, whose position on the number line is either constant infinity or some type of asymptote that is infinitely close to, but not equal to, a certain value (though its closeness would be constant)? Or is it a real number with an exact position? If it is an Elite number, then the limit could be anything, and we'd need to know more about it to find the limit; but if it is a real number, we can just say the limit tends to zero. This "n" would not be redefining infinity, as this Thomson suggested, but defining a different type of infinity for a different application, and thus would not alter any existing methods.
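Here is a minimal, hypothetical sketch of such an "Elite" type in C++ (everything about it is an assumption made for illustration: the name, the single fixed infinite unit n, and the restriction to products and quotients of powers of n; sums like 1 + 1/infinity would need more machinery):

```cpp
#include <iostream>
#include <limits>

// A toy "Elite" number: coeff * n^order, where n is one fixed,
// never-changing infinite unit.
struct Elite {
    double coeff;  // finite coefficient
    int    order;  // power of the infinite unit n
};

// Multiplying adds the powers of n; dividing subtracts them.
Elite operator*(Elite a, Elite b) { return {a.coeff * b.coeff, a.order + b.order}; }
Elite operator/(Elite a, Elite b) { return {a.coeff / b.coeff, a.order - b.order}; }

// "Pull the stored value back out": what the expression tends to.
double tendsTo(Elite e) {
    if (e.order > 0)   // an infinite factor survives
        return e.coeff > 0 ?  std::numeric_limits<double>::infinity()
                            : -std::numeric_limits<double>::infinity();
    if (e.order < 0)   // an infinitesimal factor survives
        return 0.0;
    return e.coeff;    // the infinities cancelled exactly
}

int main() {
    Elite n{1.0, 1};         // the fixed infinite unit
    Elite twoN{2.0, 1};      // "twice as big as infinity"
    Elite nSquared{1.0, 2};  // n^2
    Elite oneOverN{1.0, -1}; // 1/n

    std::cout << tendsTo(n / twoN)     << '\n';  // 0.5
    std::cout << tendsTo(n / nSquared) << '\n';  // 0
    std::cout << tendsTo(oneOverN * n) << '\n';  // 1
}
```

The one design choice is that n never moves: ratios are decided by comparing stored powers of n, which reproduces the n/2n, n/n^2, and (1/n)*n examples above.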
At this point, I suppose I'm crashing this whole discussion by defining my own rules, but I think I've found where the misunderstanding between my thinking and Ricky's lies. Infinity is not considered a real number, and thus you can't really store infinite or infinitesimal values in a constant. Now everything is clear, and all of Ricky's points seem valid by that definition.
A logarithm is just a misspelled algorithm.
Right you are, mikau. As I said in an earlier disclaimer, this stuff only applies to the reals. Most of what I said is untrue for surreal or "Elite" (good name, by the way) numbers.
But real numbers are hard enough. Once I understand them well, maybe then I'll think about moving on to other number systems.
But as for right now, I'm all about keeping it real.
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
Wasn't it you who said "Screw the REAL world"? ;-)
A logarithm is just a misspelled algorithm.
Still, it is rather confusing. I mean, it's kind of strange that the graph of cos(x) cannot be infinitely close to, but less than, 1, because this would require a value of x that was infinitesimal and not "real"; even though the value would be somewhere between 1 and -1, it would have no real location. Terribly confusing...
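Stated precisely (a standard fact, added for clarity): cos(x) does take real values arbitrarily close to, but less than, 1. For every \varepsilon > 0 there is a real x with 1 - \varepsilon < \cos(x) < 1. What does not exist is a real x whose cosine sits an infinitesimal below 1, because no positive real number is infinitesimal.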
A logarithm is just a misspelled algorithm.