The proof that the infinite sum of $\frac{1}{n}$ diverges seems to have a fair amount of breathing room. We group successive terms in quantities of increasing powers of two, starting with $\frac{1}{2}$, then $\frac{1}{3} + \frac{1}{4}$, then the next four terms, then the next eight terms, etc., and note that each of the groups is greater than or equal to $\frac{1}{2}$, and adding $\frac{1}{2}$ forever approaches $\infty$.

As extra credit for this proof, each group after the first is strictly greater than $\frac{1}{2}$, so the divergence actually occurs faster. Furthermore, we didn’t even need the terms to be that large; adding $\frac{1}{1,000,000}$ forever would also approach $\infty$. Why then, given this generous cushion in the proof, is it the case that $\frac{1}{n^{1 + \epsilon}}$ for some tiny $\epsilon$ converges? Why is the power of $n$ so fragile to nudges in the positive direction given how firmly $\frac{1}{n}$ seemed to diverge?
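The grouping argument above can be checked numerically. This is a small sketch (not part of the original question): `block_sum` is a hypothetical helper that sums one dyadic block $\frac{1}{2^k+1} + \dots + \frac{1}{2^{k+1}}$ of the harmonic series, and every block indeed comes out at least $\frac{1}{2}$.

```python
def block_sum(k, exponent=1.0):
    """Sum of 1/n^exponent for n in the dyadic block (2^k, 2^(k+1)]."""
    return sum(1.0 / n ** exponent for n in range(2 ** k + 1, 2 ** (k + 1) + 1))

# Each block of the harmonic series contributes at least 1/2,
# so the partial sums grow without bound.
for k in range(10):
    s = block_sum(k)
    assert s >= 0.5
    print(k, round(s, 6))
```

The first block ($n=2$) is exactly $\frac{1}{2}$; every later block is strictly larger, matching the "extra credit" observation.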

To address the question why a small $\epsilon$ is enough, note that analogously to the proof of divergence for the harmonic series, we can say that

$$\frac1{2^{1+\epsilon}} \ge \frac1{2^{1+\epsilon}},$$

$$\frac1{3^{1+\epsilon}} + \frac1{4^{1+\epsilon}} \ge 2\cdot\frac1{4^{1+\epsilon}} = \frac1{2^{1+2\epsilon}},$$

$$\frac1{5^{1+\epsilon}} + \frac1{6^{1+\epsilon}} + \frac1{7^{1+\epsilon}} + \frac1{8^{1+\epsilon}} \ge 4\cdot\frac1{8^{1+\epsilon}} = \frac1{2^{1+3\epsilon}},$$

and so on. Note that the terms on the right-hand side are **no longer constant**, as they were for the harmonic series; instead they form a geometric progression with ratio $\frac1{2^\epsilon}$. When $\epsilon$ is small, that ratio is just a tiny bit smaller than $1$.

Still, any geometric sequence with ratio $< 1$ converges to $0$, even if slowly. That means if we take the sum of the right-hand sides, it is no longer a divergent infinite sum of $\frac12$'s, but a geometric series that does converge!

So the main “problem” in translating the proof to $\epsilon>0$ is that our diverging minorant for the harmonic series no longer diverges!

In essence, the difference is just that $\sum_{i=0}^{\infty}\frac12$ diverges, while $\sum_{i=0}^{\infty}\frac1{2^{1+i\epsilon}}$ converges.
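To make that difference concrete, here is a small numerical sketch (the value $\epsilon = 0.1$ is an arbitrary choice, not from the answer): the right-hand sides form a geometric series with ratio $r = 2^{-\epsilon} < 1$, whose partial sums settle at the closed-form value $\frac{1/2}{1 - 2^{-\epsilon}}$ instead of growing without bound.

```python
# Geometric series of the lower bounds: sum_{i>=0} 1/2^(1+i*eps).
eps = 0.1            # assumed value for illustration
r = 2 ** -eps        # common ratio, just below 1 for small eps
partial = sum(0.5 * r ** i for i in range(10_000))
closed_form = 0.5 / (1 - r)
print(partial, closed_form)  # the partial sum matches the closed form
```

Had the ratio been exactly $1$ (the $\epsilon = 0$ case), the same loop would have produced $10{,}000 \times \frac12$ and kept growing with the number of terms.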

This insight allows you to actually prove that $\sum_{n=1}^{\infty}\frac1{n^{1+\epsilon}}$ converges without the integral methods mentioned in other answers.

That’s because

$$\frac1{3^{1+\epsilon}} + \frac1{4^{1+\epsilon}} \le 2\cdot\frac1{2^{1+\epsilon}} = \frac1{2^{\epsilon}},$$

$$\frac1{5^{1+\epsilon}} + \frac1{6^{1+\epsilon}} + \frac1{7^{1+\epsilon}} + \frac1{8^{1+\epsilon}} \le 4\cdot\frac1{4^{1+\epsilon}} = \frac1{2^{2\epsilon}},$$

$$\frac1{9^{1+\epsilon}} + \frac1{10^{1+\epsilon}} + \frac1{11^{1+\epsilon}} + \frac1{12^{1+\epsilon}} + \frac1{13^{1+\epsilon}} + \frac1{14^{1+\epsilon}} + \frac1{15^{1+\epsilon}} + \frac1{16^{1+\epsilon}} \le 8\cdot\frac1{8^{1+\epsilon}} = \frac1{2^{3\epsilon}},$$

and so on.

Now our series has a convergent majorant $\sum_{i=0}^{\infty}\frac1{2^{i\epsilon}}$, so it converges itself.
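The block bounds above can be verified numerically. This sketch (with an arbitrary $\epsilon = 0.1$) checks that each dyadic block of $\sum \frac1{n^{1+\epsilon}}$ really is dominated by the corresponding geometric term $\frac1{2^{k\epsilon}}$:

```python
eps = 0.1  # assumed value for illustration

def block(k):
    """Sum of 1/n^(1+eps) over the dyadic block (2^k, 2^(k+1)]."""
    return sum(1.0 / n ** (1 + eps) for n in range(2 ** k + 1, 2 ** (k + 1) + 1))

# Each block has 2^k terms, each at most 1/(2^k)^(1+eps),
# so the block sum is at most 2^k / 2^(k(1+eps)) = 1/2^(k*eps).
for k in range(1, 15):
    assert block(k) <= 2 ** (-k * eps)
```

Since the geometric majorant has a finite total, the comparison test gives convergence without any integrals.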

I will address the last paragraph of your post. As long as $\epsilon$ is a fixed positive quantity, the series converges. This can be seen with the integral test: $1/x$ integrates to $\ln x$, and as $x$ goes to infinity, the integral and thus the series diverges. But if the exponent is greater than $1$, the power term integrates to another power term. I leave it up to you to figure out why that implies convergence, because this essentially answers the last part of your question.

Lastly (and not unimportantly!), a note on the word “fixed”. If the epsilon is not a fixed positive quantity but variable, then the series may be divergent. For example, the series $\sum\frac{1}{n^{1+1/n}}$ has a “variable” exponent, and the exponent is more than $1$; yet this series turns out to be divergent.
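One way to see why that last example diverges, sketched numerically (the bound $1.45$ is chosen for illustration): $n^{1/n}$ tends to $1$ and is bounded above by its maximum $3^{1/3} \approx 1.44$, so $\frac{1}{n^{1+1/n}} \ge \frac{1}{1.45\,n}$, and the series diverges by comparison with the harmonic series.

```python
# n^(1/n) -> 1 and peaks at n = 3, so it is bounded by 3^(1/3) ~ 1.44.
m = max(n ** (1 / n) for n in range(1, 10_000))
print(m)

# Hence 1/n^(1+1/n) = 1/(n * n^(1/n)) >= 1/(1.45*n),
# a constant multiple of the harmonic term.
for n in range(1, 10_000):
    assert 1 / n ** (1 + 1 / n) >= 1 / (1.45 * n)
```

So a variable exponent that is "more than $1$" but shrinks toward $1$ fast enough gives no extra decay beyond a constant factor.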

Also, the integral test can be used to show that

$$\sum \dfrac1{n\ln \ln \dots \ln(n)}$$

diverges for any fixed number of nested $\ln$.

This is because, if we define $\ln_0(n) = n$ and $\ln_{k+1}(n) = \ln(\ln_k(n))$, then

$$(\ln_k(x))' = \dfrac1{x\prod_{j=1}^{k-1}\ln_{j}(x)}.$$

Proof.

$$(\ln_1(x))' = (\ln(x))' = \dfrac1{x}$$

and

$$(\ln_2(x))' = (\ln(\ln(x)))' = (\ln(x))'\,\dfrac1{\ln(x)} = \dfrac1{x\ln(x)}.$$

If $(\ln_k(x))' = \dfrac1{x\prod_{j=1}^{k-1}\ln_{j}(x)}$, then

$$\begin{array}{rl}
(\ln_{k+1}(x))' &=(\ln(\ln_k(x)))'\\
&=(\ln_k(x))'\,\dfrac1{\ln_k(x)}\\
&=\dfrac1{x\prod_{j=1}^{k-1}\ln_{j}(x)\,\ln_k(x)}\\
&=\dfrac1{x\prod_{j=1}^{k}\ln_{j}(x)}.\\
\end{array}$$

Since $\ln_k(x) \to \infty$ as $x \to \infty$ for any fixed $k$,

$$\int \dfrac{dx}{x\prod_{j=1}^{k-1}\ln_{j}(x)} = \ln_k(x) \to \infty$$

as $x \to \infty$, so

$$\sum \dfrac1{n\prod_{j=1}^{k-1}\ln_{j}(n)}$$

diverges by the integral test.

You can similarly show that

$$\sum \dfrac1{n\prod_{j=1}^{k-1}\ln_{j}(n)\,\ln_{k}^{1+\epsilon}(n)}$$

converges for any fixed $k$ and $\epsilon > 0$.
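The derivative identity at the heart of this argument can be sanity-checked numerically. This sketch (names `ln_k` and `deriv_formula` are mine, not from the answer) compares a central finite difference of the iterated logarithm against the closed-form $\dfrac1{x\prod_{j=1}^{k-1}\ln_j(x)}$:

```python
import math

def ln_k(k, x):
    """Iterated logarithm: ln_0(x) = x, ln_{k+1}(x) = ln(ln_k(x))."""
    for _ in range(k):
        x = math.log(x)
    return x

def deriv_formula(k, x):
    """Closed form (ln_k(x))' = 1 / (x * prod_{j=1}^{k-1} ln_j(x))."""
    p = 1.0
    for j in range(1, k):
        p *= ln_k(j, x)
    return 1.0 / (x * p)

# Central finite difference at an arbitrary test point.
x, h, k = 100.0, 1e-6, 2
numeric = (ln_k(k, x + h) - ln_k(k, x - h)) / (2 * h)
print(numeric, deriv_formula(k, x))
```

The two values agree to many digits, and the tiny size of the derivative ($\approx \frac{1}{x \ln x}$ already at $k=2$) illustrates how excruciatingly slowly these series diverge.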