Hi guys, can anyone explain Newton's method, Gaussian elimination, and continued fractions in a simple, easy-to-understand way?
Hi Fla$h;
Yikes! There are many books written on each of those.
Newton's iteration is a method invented by Sir Isaac Newton to solve equations. He used it only on polynomials, but it has wide applicability. Basically, it drops a tangent line from the current guess down to the x-axis, hoping that the intercept is a better estimate of the root. Often it is.
The iteration looks like this:

x_{n+1} = x_n - f(x_n) / f'(x_n)
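As a concrete sketch (my own example, not from the thread), here is that iteration in Python, finding sqrt(2) as the positive root of f(x) = x^2 - 2:

```python
# Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n)
# Illustrative example: find sqrt(2) as the positive root of f(x) = x^2 - 2.

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Iterate Newton's method from the initial guess x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)   # drop the tangent to the x-axis
    return x

root = newton(lambda x: x*x - 2, lambda x: 2*x, x0=1.0)
print(root)  # ~ 1.4142135623730951
```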
When you have a more specific question, please post.
Last edited by bobbym (2009-08-17 18:48:59)
In mathematics, you don't understand things. You just get used to them.
If it ain't broke, fix it until it is.
Always satisfy the Prime Directive of getting the right answer above all else.
Um... can you explain them to me?
Sorry, wrong post. What I meant to say is I don't have the patience to read Wikipedia (lol!)
Hi Fla$h;
Gaussian elimination is an algorithm, named after Gauss, for solving systems of simultaneous linear equations.
http://en.wikipedia.org/wiki/Gaussian_elimination
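Here is a small sketch of the idea in Python (my own toy example, not from the thread): reduce the augmented matrix to upper-triangular form by row operations, then back-substitute. The 2x2 system is 2x + y = 5, x + 3y = 10, with solution x = 1, y = 3.

```python
# Gaussian elimination with partial pivoting: solve A x = b.

def gauss_solve(A, b):
    n = len(A)
    # Build the augmented matrix [A | b] so row operations act on both sides.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # [1.0, 3.0]
```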
For continued fractions:
http://en.wikipedia.org/wiki/Continued_fraction
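The basic recipe for a continued fraction is: take the integer part, invert the fractional remainder, and repeat. A minimal sketch in Python (my own example; 415/93 = [4; 2, 6, 7] is a standard illustration):

```python
# Continued fractions: repeatedly take the integer part and invert the remainder.
# Example: 415/93 = [4; 2, 6, 7], i.e. 4 + 1/(2 + 1/(6 + 1/7)).
from fractions import Fraction

def continued_fraction(x, max_terms=20):
    """Return the continued-fraction coefficients of a Fraction x."""
    terms = []
    for _ in range(max_terms):
        a = x.numerator // x.denominator   # integer part
        terms.append(a)
        frac = x - a                        # fractional remainder
        if frac == 0:
            break
        x = 1 / frac                        # invert and repeat
    return terms

print(continued_fraction(Fraction(415, 93)))  # [4, 2, 6, 7]
```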
Please try to read these; even a quick read is better than nothing. I can only provide some examples, I cannot reinvent mathematics. I will answer any questions you have afterwards.
Here are some videos on Gaussian elimination.
http://www.youtube.com/watch?v=P2abN3P32cY
http://www.youtube.com/watch?v=m3ooPCNaoTA
http://www.youtube.com/watch?v=YPK_gYjS … re=related
Last edited by bobbym (2009-08-17 16:29:40)
What I meant to say is I don't have the patience to read Wikipedia (lol!)
Then why do you expect to have the patience to read whatever it is we write? This site is mostly for answering questions, not replacing your textbook.
"In the real world, this would be a problem. But in mathematics, we can just define a place where this problem doesn't exist. So we'll go ahead and do that now..."
You are right. Well, I decided to read the article and I could understand it. But I have a question about Newton's method. In the Wikipedia article it says:
Newton's method can often converge remarkably quickly, especially if the iteration begins "sufficiently near" the desired root. Just how near "sufficiently near" needs to be, and just how quickly "remarkably quickly" can be, depends on the problem. Unfortunately, when iteration begins far from the desired root, Newton's method can easily lead an unwary user astray with little warning. Thus, good implementations of the method embed it in a routine that also detects and perhaps overcomes possible convergence failures.
Are there any guidelines for a "good implementation"?
Hi;
A good rule of thumb: get close first, then use Newton's.
There is a difference between textbook math and the real world. Textbooks overuse Newton's, but in the real world you would first locate the root with something that converges more slowly but is more robust, then use Newton's to zero in.
Newton's is fine for equations in one variable like y = x^2 - 2; you would graph it first to get an idea of where the roots are. But for systems of nonlinear equations in many variables, where you cannot graph them, you use other methods to get close first. I have seen Bairstow's method, squaring methods, steepest descent, Graeffe's, Muller's, Sturm sequences, and dozens more used to get close. Wikipedia can bring you up to snuff on each of these, but for a real education on root finding I suggest Forman S. Acton's books.
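That "get close first" strategy can be sketched in Python like this (my own toy hybrid, using bisection as the robust first stage; the example function f(x) = x^3 - 2 is my choice):

```python
# Robust-then-fast root finding: bisect to get close, then let Newton's zero in.

def bisect(f, a, b, steps=20):
    """Crude bisection: shrink [a, b] around a sign change."""
    for _ in range(steps):
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    return (a + b) / 2

def newton(f, fprime, x0, tol=1e-12, max_iter=20):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

f = lambda x: x**3 - 2
x0 = bisect(f, 1.0, 2.0, steps=5)       # robust but slow: get near the root
root = newton(f, lambda x: 3*x*x, x0)   # fast quadratic convergence to finish
print(root)  # ~ 1.2599210498948732 (the cube root of 2)
```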
Solving equations numerically is an art form. Luckily, the kind of problems you will see in books will not give Newton's method many problems.
Last edited by bobbym (2009-08-19 02:29:16)
Something like the secant method?
Hi Fla$h;
Yes; that has superlinear convergence, as opposed to Newton's quadratic convergence (a fancy term meaning, roughly, that the number of correct digits doubles with each iteration). Also, interval bisection is sometimes used to get close.
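For reference, the secant method replaces the derivative in Newton's formula with the slope through the last two iterates, so no f'(x) is needed. A minimal sketch (my own example, again using f(x) = x^2 - 2):

```python
# Secant method: like Newton's, but the derivative is approximated from the
# last two iterates. Convergence is superlinear (order ~1.618), a bit slower
# than Newton's quadratic rate, but it needs no derivative.

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            return x1
        # Replace f'(x) with the slope of the secant line through the two points.
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

print(secant(lambda x: x*x - 2, 1.0, 2.0))  # ~ 1.4142135623730951
```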
Solving equations numerically is an art form. Luckily, the kind of problems you will see in books will not give Newton's method many problems.
Newton's has many, many problems. It doesn't handle multiple roots well. It sometimes blasts off into the complex plane. It has trouble with local minima and maxima. It sometimes oscillates or even converges to the wrong root. But there are workarounds for all of these problems. With today's heavy use of black-box software and packages, much of this discussion does not apply.
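One of those failure modes is easy to demonstrate. With the classic textbook example f(x) = x^3 - 2x + 2 started at x0 = 0, Newton's method cycles forever between 0 and 1 and never finds the real root near -1.77:

```python
# Demonstration of Newton's method oscillating instead of converging.
# f(x) = x**3 - 2*x + 2 started at x0 = 0 cycles between 0 and 1.

f = lambda x: x**3 - 2*x + 2
fprime = lambda x: 3*x**2 - 2

x = 0.0
history = []
for _ in range(6):
    history.append(x)
    x = x - f(x) / fprime(x)

print(history)  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
```

This is why good implementations watch for non-decreasing |f(x)| or cycling iterates and fall back to a safer method such as bisection.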
Last edited by bobbym (2009-08-19 02:38:48)