Hello.
I am trying to teach myself infinite series. I recently posted questions on this forum about the relationship between rational expressions and hyperbolas, and because I am still in that mindset, I am using that example.
I have managed to work out that the series expansion or infinite series or infinite sum or whatever it is called for a function
will be of the form
such that
if computed will yield the desired function. I have confirmed this in Maple.
How do I go about doing this by hand so that I can show all my work? Computing it in Maple is easy, but I want to know how to do it for myself. I attempted the problem on my own, treating it as an integration problem, but that did not work, and it makes sense that it did not: I am summing to infinity, not finding the area under a curve.
Any help is appreciated.
Hi;
Series expansions only converge for specific ranges. Do you want to expand around x = 0 or something else?
In mathematics, you don't understand things. You just get used to them.
If it ain't broke, fix it until it is.
Always satisfy the Prime Directive of getting the right answer above all else.
Whatever works; whatever gets the desired result. If x=0 works then yes.
Hi;
A Taylor polynomial is a truncated Taylor series. In numerical mathematics we use approximations for computation. If you intend to plug small values of x into your function, then you would produce a Taylor series around 0.
It is easy to demonstrate the work to produce such a formula.
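For reference, here is a minimal sketch of the by-hand procedure being described. The general Maclaurin formula below is standard; the function 1/(1 - x) is only an assumed illustration, since the thread's actual function is not shown:

f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n = f(0) + f'(0)\,x + \frac{f''(0)}{2!} x^2 + \cdots

For f(x) = \frac{1}{1-x} the derivatives are f^{(n)}(x) = \frac{n!}{(1-x)^{n+1}}, so f^{(n)}(0) = n! and every coefficient equals 1:

\frac{1}{1-x} = 1 + x + x^2 + x^3 + \cdots = \sum_{n=0}^{\infty} x^n, valid for |x| < 1.

So "doing it by hand" amounts to differentiating repeatedly, evaluating at the expansion point, and dividing by n!.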
I see what you are saying and I already know that. When it is around zero it is a Maclaurin series. My original question is, how do I go from
to
without the aid of a computer program like Maple? Is there a way to compute it by hand like an integral? That is the information I am lacking.
It is easy to go from the function to the series, but going back might not be as easy; I have never seen it done. If you were to undo the differentiations, that still would not produce the original function, just another polynomial.
When you take the Maclaurin series of sin(x) you get:
\sin(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots
Now it is obvious that no matter what we do to that, we will never get sin(x) back.
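For completeness, here is the standard by-hand derivation of that series (ordinary calculus, not specific to this thread). The derivatives of sin x cycle through cos x, -sin x, -cos x, sin x, so at x = 0 they cycle through 1, 0, -1, 0, and the Maclaurin formula keeps only the odd powers:

\sin(x) = \sum_{k=0}^{\infty} \frac{(-1)^k}{(2k+1)!} x^{2k+1} = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots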
So the Taylor (Maclaurin) series is just a way to represent a function and isn't necessarily meant to be solved except by computer. Got it.
Thanks for your help.
Hi;
Yes, they are the same thing for certain values only. Sometimes when you see the series on the right, you will be able to recognize which function it belongs to; in most cases you will not. You could always curve fit, but then you must already know the form on the left.
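To illustrate the recognition step with assumed examples (not the series from this thread): seeing 1 + x + x^2 + x^3 + \cdots you might recognize the geometric series for \frac{1}{1-x} with |x| < 1, and seeing 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots you might recognize e^x. For an unfamiliar series there is usually no such shortcut.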
So the Taylor (Maclaurin) series is just a way to represent a function and isn't necessarily meant to be solved except by computer.
The Taylor series is very useful in numerical work. It is used to approximate functions and to develop new methods.
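As a small numerical illustration of that use (my own example, built from the sin(x) series given earlier in the thread): truncating after the cubic term gives

\sin(0.1) \approx 0.1 - \frac{0.1^3}{3!} = 0.0998333\ldots, \qquad \sin(0.1) = 0.0998334\ldots,

with the error bounded by the first omitted term, \frac{0.1^5}{5!} \approx 8.3 \times 10^{-8}. That is the sense in which a truncated series approximates the function.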
Thanks for all the info.
Concerning the notation, is it just as proper to write
in the form
or is the second form superfluous? Is either one considered more correct?
Hi;
I would say both are correct but I prefer the first one.
Yeah, I think I do too. The second one feels unnecessarily drawn out just to say the exact same thing.
Something to remember: it might be useful if you could get the sum in analytical form for an upper bound of n. Then you could use the laws of limits to take the sum to infinity. This is only theoretical as far as I am concerned; it is almost never easier to get the sum up to n than to get the sum to infinity.
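A concrete instance of that idea (an assumed example, since the sum in question is not shown): the geometric series has a closed-form partial sum, and the limit laws then give the infinite sum:

\sum_{k=0}^{n} x^k = \frac{1 - x^{n+1}}{1 - x}, \qquad \lim_{n \to \infty} \frac{1 - x^{n+1}}{1 - x} = \frac{1}{1 - x} \quad \text{for } |x| < 1,

because x^{n+1} \to 0 when |x| < 1. For most series no such closed-form partial sum exists, which is the point being made above.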
It is easy to go from the function to the series, but going back might not be as easy; I have never seen it done. If you were to undo the differentiations, that still would not produce the original function, just another polynomial.
When you take the Maclaurin series of sin(x) you get:
\sin(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots
Now it is obvious that no matter what we do to that, we will never get sin(x) back.
We can form a second order differential equation and get it from there.
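A sketch of what that could mean (my reading of the suggestion; it is not spelled out in the thread): differentiating the quoted series term by term twice gives back the negative of the series, so its sum y(x) satisfies

y'' + y = 0, \qquad y(0) = 0, \quad y'(0) = 1,

and the unique solution of that initial value problem is y(x) = \sin(x). In that sense the differential equation recovers the closed form from the series.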
Here lies the reader who will never open this book. He is forever dead.
Taking a new step, uttering a new word, is what people fear most. ― Fyodor Dostoyevsky, Crime and Punishment
The knowledge of some things as a function of age is a delta function.
We can form a second order differential equation and get it from there.
How?
bobbym: Which notation of the initial statement would you say is the most correct?
or
or
or something else? They can all be interpreted to mean the same thing but I do not know if any of them are better than the rest.
Hi;
I would use the first one.