I have this problem that I can't figure out.
a_n is a sequence of positive numbers. For each natural number n, b_n = (a_1 + a_2 + ... + a_n)/n. I have to use this to show that
∑_{n=1}^∞ b_n
diverges to positive infinity. Anyone have any ideas?
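Just so the definition is concrete, here is a tiny Python sketch of how the b_n come from the a_n (the four numbers below are placeholders I made up, not part of the problem):

# b_n is the running average of a_1, ..., a_n; the values here are made-up placeholders.
a = [2.0, 5.0, 1.0, 3.0]                              # any positive numbers
b = [sum(a[:n]) / n for n in range(1, len(a) + 1)]    # b_n = (a_1 + ... + a_n)/n
print(b)                                              # -> [2.0, 3.5, 2.666..., 2.75]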
Last edited by woodoo (2007-02-12 05:42:16)
Let me try:-
I am not using any standard test, just logic.
b_1=1/1=1
b_2=3/2=1.5
b_3=6/3=2
b_4=10/4=2.5
b_5=15/5=3
b_6=21/6=3.5
b_7=28/7=4
The sum of the first n natural numbers is given by the formula n(n+1)/2.
When this is divided by n, (n+1)/2 is obtained.
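Here is a quick numerical check of that in Python, assuming a_n = n as the table above implicitly does (that assumption is mine for the check, not stated in the original problem):

# Check that b_n = (n+1)/2 when a_n = n (assumption used only for this check).
for n in range(1, 8):
    partial_sum = n * (n + 1) // 2     # a_1 + ... + a_n = n(n+1)/2 when a_k = k
    b_n = partial_sum / n              # running average of the first n terms
    print(n, b_n, (n + 1) / 2)         # the last two columns agree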
∑_{n=1}^∞ b_n is the same as ∑_{n=1}^∞ (n+1)/2.
As n → ∞, the partial sums also tend to ∞, therefore the series is
Divergent!
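To spell out that last step (still under the assumption a_n = n, so b_n = (n+1)/2), the partial sums can be written in closed form:

\sum_{n=1}^{N} b_n = \sum_{n=1}^{N} \frac{n+1}{2} = \frac{1}{2}\left(\frac{N(N+1)}{2} + N\right) = \frac{N(N+3)}{4} \to \infty \quad \text{as } N \to \infty.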
Have I made a mistake somewhere? I guess not
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
While you're right for that particular sequence, I think you need to be more general. Instead of choosing a_1, a_2, a_3, ... to be 1, 2, 3, ... (so that the partial sums are 1, 3, 6, etc.), I think it's required to show that it's true for any sequence of positive numbers.
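For illustration only (not a proof), here is a quick Python experiment with a different positive sequence, a_n = 1/n^2, which I picked arbitrarily; the partial sums of b_n still keep growing, which is consistent with the claim holding for any positive sequence:

# Not a proof, just an experiment: take a_n = 1/n^2 (an arbitrary positive sequence)
# and watch the partial sums of b_n grow as N increases.
partial_a = 0.0                      # a_1 + ... + a_n
partial_b = 0.0                      # b_1 + ... + b_n
for n in range(1, 10**6 + 1):
    partial_a += 1.0 / n**2
    partial_b += partial_a / n       # b_n = (a_1 + ... + a_n)/n
    if n in (10, 10**2, 10**3, 10**4, 10**5, 10**6):
        print(n, partial_b)          # keeps increasing, roughly like (pi^2/6)*ln(n) here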
Why did the vector cross the road?
It wanted to be normal.