Math Genius: Evaluating $\lim_{x \to \frac{\pi}{6}}{(1-2\sin(x))}^{\tan(\frac{\pi}{6}-x)}$

I’m struggling a bit with this limit; can anyone help me, please?

$$\lim_{x \to \frac{\pi}{6}}{(1-2\sin(x))}^{\tan(\frac{\pi}{6}-x)}$$

I tried to use the logarithm to then use L’Hospital’s rule but I got stuck here:
$\ln(L)=\lim_{x \to \frac{\pi}{6}}{\left[\tan\left(\frac{\pi}{6}-x\right)\ln(1-2\sin(x))\right]}$

Thank you!

Let $f(x) = (1-2\sin x)^{\tan(\frac{\pi}{6}-x)}$, then $f(x) = e^{g(x)}$ with $g(x) = \tan(\frac{\pi}{6}-x) \log (1-2\sin x)$.

$$\begin{align}
\lim\limits_{x \to \frac{\pi}{6}^- } g(x)
&= \lim\limits_{x \to \frac{\pi}{6}^- } \frac{\tan\left(\frac{\pi}{6}-x\right)}{\frac{\pi}{6}-x} \left(\frac{\pi}{6}-x\right)\log \left(1-2\sin x\right)
\\
&\overset{(1)}{=} \lim\limits_{x \to \frac{\pi}{6}^- } \left(\frac{\pi}{6}-x\right)\log \left(1-2\sin x\right)
\\
&=\lim\limits_{x \to \frac{\pi}{6}^- } \frac{ \log (1-2\sin x)}{\frac{1}{\frac{\pi}{6}-x}}
\\
&\overset{\mathrm{H}}{=} \lim\limits_{x \to \frac{\pi}{6}^-} (-2\cos x)\frac{\left(\frac{\pi}{6}-x\right)^2}{1-2\sin x}
\\
&= -\sqrt{3}\lim\limits_{x \to \frac{\pi}{6}^-} \frac{\left(\frac{\pi}{6}-x\right)^2}{1-2\sin x}
\\
&\overset{\mathrm{H}}{=}
-\sqrt{3}\lim\limits_{x \to \frac{\pi}{6}^-}\frac{-2\left(\frac{\pi}{6}-x\right)}{-2\cos x }
\\
&= 0
\end{align}$$

where in $(1)$ I have used $\lim_{y\to0} \frac{\tan y}{y} = 1$ and $\mathrm{H}$ denotes the usage of L’Hôpital’s rule.

Hence, we conclude that

$$\lim\limits_{x \to \frac{\pi}{6}^-} f(x) = e^0 = 1.$$
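A quick numerical sanity check of this conclusion (a minimal Python sketch of the $f$ defined above): the values do tend to $1$ as $x \to \frac{\pi}{6}^-$.

```python
import math

def f(x):
    # (1 - 2*sin(x)) ** tan(pi/6 - x), defined for x slightly below pi/6
    return (1.0 - 2.0 * math.sin(x)) ** math.tan(math.pi / 6 - x)

for k in range(2, 10):
    x = math.pi / 6 - 10.0 ** (-k)   # approach pi/6 from the left
    print(f"pi/6 - 1e-{k}:  f(x) = {f(x):.10f}")
```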

In the “$\log$” expression, expand $\sin$ around $x_0=\frac{\pi}{6}$ using a Taylor series up to the second term, to get
$$
\sin x \approx \frac{1}{2} + \frac{\sqrt{3}}{2}\left(x-\frac{\pi}{6}\right)
$$

so the expression $\log(1-2 \sin x)$ becomes $\log\left(\sqrt{3} \left(\frac{\pi}{6} - x\right)\right) = \log \sqrt{3} + \log \left(\frac{\pi}{6} - x\right)$. Now set $t=\frac{\pi}{6} - x$, rewrite $\tan\left(\frac{\pi}{6}-x\right) = \tan t = \frac{\sin t}{\cos t}$ and expand $\sin t \sim t$ for $t \to 0^+$. Since the limit is approached from the right in $t$, the only delicate term reduces to

$$
\lim_{t \to 0^{+}} t \log t
$$

Now you can rewrite $t \log t = \frac{\log t }{\frac{1}{t}}$, and note that $\frac{1}{t} \to \infty$ and $\log t \to -\infty$. Set $v=-\log t$, so that $t = e^{-v}$ and $v \to \infty$; then $t\log t = -\frac{v}{e^v}$, and obviously
$$
\lim_{v \to \infty}\frac{v }{e^v} = 0
$$

All other terms converge to constants and are easy to compute. Keep in mind also that the original expression is $\varphi = e^{\log \varphi}$, so don’t forget to exponentiate at the end.

Result: no L’Hôpital’s rule used, only Taylor series expansion!
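The key step $\lim_{t\to0^+} t\log t = 0$ is also easy to see numerically (a minimal Python sketch):

```python
import math

# t*log(t) -> 0 as t -> 0+, the crucial limit in the Taylor-series argument
for k in range(1, 9):
    t = 10.0 ** (-k)
    print(f"t = 1e-{k}:  t*log(t) = {t * math.log(t): .3e}")
```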

Your job might be simpler if you substitute $\pi/6-x=2t$. Then
$$
1-2\sin x=1-2\sin(\pi/6-2t)=1-\cos 2t+\sqrt{3}\sin 2t=2\sin t(\sin t+\sqrt{3}\cos t)
$$

Note that in order that the limit makes sense you need $\sin x<1/2$, so $0<x<\pi/6$ (the lower bound is mostly irrelevant, though), hence $t>0$.

How does this help? You get to evaluate the limit for $t\to0$ of
$$
\tan2t\bigl(\log(\sin t)+\log(2\sin t+2\sqrt{3}\cos t)\bigr)
$$

The part $\tan2t\log(2\sin t+2\sqrt{3}\cos t)$ poses no problem: its limit is $0$. Then you need to compute the limit of
$$
\frac{2\cos t}{\cos2t}\sin t\log\sin t
$$

The fraction part has limit $2$. The part $\sin t\log\sin t$ has limit $0$, as it’s easy to show with l’Hôpital or other methods.

Hence the limit is $0$. Therefore your original limit is $e^0=1$.

Math Genius: Without using integration and a graphing calculator, plot the graph of $y=f(x)$, given that its derivative is $f'(x)=e^{-x^{2}}$ and $f(0)=0$.

With my little understanding of calculus, I calculated $\lim\limits_{x\to+\infty} f'(x)=0$ and $\lim\limits_{x\to-\infty} f'(x)=0$. Based on this information, I guessed that the graph must flatten for extremely large (whether positive or negative) values of $x$. Further, $$f''(x)=-2xe^{-x^{2}}$$ From this, I deduced that for $x<0$ the slope is increasing, while for $x>0$, the slope is decreasing. Given that $f(0)=0$ and $f'(0)=1$, the graph passes through the origin. Based on all this information, I figured the graph looks like this [original sketch omitted].

The only part I failed to figure out is this:

How can I calculate the values of the horizontal asymptotes enclosing the graph? Can this be done without explicitly involving integration?

In other words: evaluate $\int_0^\infty \exp(-x^2)\;dx$ without integration. I assume “without integration” allows neither multiple integration nor contour integration and residue theory. The only way to do this that I can think of is: look it up.


Relating to the contour integration method:

Desbrow, Darrell. On Evaluating $\displaystyle\int_{-\infty}^\infty e^{ax(x-2b)}\, dx$ by Contour Integration Round a Parallelogram. Amer. Math. Monthly 105 (1998), no. 8, 726–731.

DISCLAIMER: This is done by integration, so count it as “additional information” as said in the comments.

$$f'(x)=e^{-x^2}$$

$$\int_0^{f(x)} \,d(f(x)) = \int_0^x e^{-t^2}\,dt $$

$$f(x)=\int_0^x e^{-t^2}\,dt$$

The function $y=e^{-x^2}$ should be familiar to you. It is (up to normalisation) the bell curve that defines the normal or Gaussian distribution.

Let’s first look at the integral from $x=-\infty$ to $\infty$, rather than from $x=0$ to $x$.

In $\mathbb{R}^3$ consider two curves: $z=e^{-x^2}$ and $z=e^{-y^2}$.

Both of these stand vertically upon the x-y plane, with their peaks pointing in the direction of the z-axis.

The two areas under these curves would be the same: $A=\int_{-\infty}^{\infty} e^{-x^2}\,dx$ and $A=\int_{-\infty}^{\infty} e^{-y^2}\,dy$

Multiplying the two gives the volume under a 3-dimensional bell-shaped surface.

We get:

$$A^2=\int_{-\infty}^{\infty} e^{-x^2}\,dx \times \int_{-\infty}^{\infty} e^{-y^2}\,dy$$

This gives a nested double integral:

$$A^2= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^2+y^2)}\,dx\,dy$$

Now we can shift to polar coordinates $(r,\theta)$, where $r$ is the polar radius of any point $(x,y)$.

Let us view that 3-D bell curve as a solid of revolution created by rotating the curve $z=e^{-r^2}$, stretching from $r=0$ to $r=\infty$, about the z-axis through an angle equal to $2\pi$ radians. We will have to change the limits of our double integrals accordingly.

We will also have to redefine that double integral from $dy$ & $dx$ to $dr$ & $d\theta$, to adapt to a polar world. In the Cartesian world, the volume of a 3D solid is computed by adding the volumes of an infinite number of thin, vertical columns of square cross-section $\,dy\,dx$. In the polar world, the same solid has to be imagined as being made of an infinite number of very tiny, concentric arc-shaped sections of radial thickness $dr$ and arc-width $r\,d\theta$.

Our new equation will then be:

$$A^2= \int_0^{2\pi} \int_{0}^{\infty} re^{-r^2} \,dr\,d{\theta}$$

$$A^2=\left( \int_0^{2\pi} \,d{\theta} \right) \left( \int_0^\infty re^{-r^2}\,dr \right)$$

Substituting $u=r^2$, $du=2r\,dr$:

$$A^2=(2\pi) \left( \int_0^{\infty} \frac{1}{2} e^{-u}\,du \right) $$

$$A^2=\pi$$

$$A=\sqrt\pi$$

So, $$\int_{-\infty}^{\infty} e^{-x^2}\,dx=\sqrt\pi \qquad\text{and hence}\qquad \int_0^\infty e^{-x^2}\,dx=\frac{\sqrt\pi}{2}$$
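A quick numerical confirmation of this value (a minimal Python sketch; it assumes SciPy is available for the quadrature):

```python
import math
from scipy import integrate  # assumed available

# integral of exp(-x^2) over the whole real line should equal sqrt(pi)
val, err = integrate.quad(lambda x: math.exp(-x * x), -math.inf, math.inf)
print(val, math.sqrt(math.pi))   # both ~ 1.7724538509
print(val / 2)                   # half-line value ~ 0.8862269255, the asymptote asked about
```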

Now, to calculate the integral we require, let’s resume at the equation $A^2=\int_0^{2\pi} \int_0^\infty re^{-r^2} \,dr \,d\theta$.

Let us switch the limits of $r$ to: $r=0$ to $r=x$.

We get the volume of our 3D solid to be:

$$A^2=\left( \int_0^{2\pi} \,d{\theta} \right) \left( \int_0^x re^{-r^2}\,dr \right)$$

$$=(2\pi)\left( \frac{1}{2} \int_0^{x^2} e^{-u}\,du \right)$$

$$A^2=\pi \times (1-e^{-x^2})$$

$$A=\sqrt{\pi (1-e^{-x^2})}$$

Now notice that the volume used here was the volume of our 3D bell-shaped solid, but truncated at a radius of $x$.

So this $A$ is approximately $\int_{-x}^x e^{-t^2}\,dt$ (strictly, $A^2$ here is the double integral over the disc of radius $x$, whereas $\left(\int_{-x}^x e^{-t^2}\,dt\right)^2$ is the double integral over the square $[-x,x]^2$, so the closed form below is a lower bound that becomes exact only as $x\to\infty$).

And by symmetry of the integrand, $\int_0^x e^{-t^2}\,dt$ will be half of $\int_{-x}^x e^{-t^2}\,dt$.

Thus, finally:

$$f(x) = \int_0^{x} e^{-t^2} \,dt \approx \frac{1}{2} \sqrt{\pi (1-e^{-x^2})} $$

NOTE: this is only for $x>0$. For $x<0$, take the negative root when extracting the square root of $A^2$.

So the asymptotes can be found by applying $\lim_{x \to \pm\infty}$ to $f(x)$, which gives $y=\pm\frac{\sqrt\pi}{2}$.

NOTE: for $x<0$, $f(x) \approx -\frac{1}{2} \sqrt{\pi (1-e^{-x^2})}$
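To see the asymptote and the quality of the closed form numerically, here is a minimal Python sketch (the helper names are mine; it uses `math.erf` for the exact value, which of course amounts to "looking it up"):

```python
import math

ASYMPTOTE = math.sqrt(math.pi) / 2               # ~ 0.8862269255

def f_approx(x):
    # closed-form lower-bound approximation derived above
    s = 0.5 * math.sqrt(math.pi * (1.0 - math.exp(-x * x)))
    return s if x >= 0 else -s

def f_exact(x):
    # exact antiderivative with f(0)=0: (sqrt(pi)/2) * erf(x)
    return ASYMPTOTE * math.erf(x)

for x in (0.5, 1.0, 2.0, 4.0, -4.0):
    print(f"x={x:5}:  approx={f_approx(x): .6f}   exact={f_exact(x): .6f}")
print("horizontal asymptotes: +/-", ASYMPTOTE)
```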

Math Genius: If a function is continuous on a closed interval, then it is bounded on that interval.

So, the theorem I’m trying to prove is the following:

If $f$ is continuous on $[a,b]$, then $f$ is bounded on $[a,b]$.


Proof Attempt:

Let $f$ be continuous on $[a,b]$. Suppose that $f$ is not bounded on $[a,b]$. So, there does not exist an $M > 0$ such that:

$$|f(x)| \leq M$$

for any $x \in [a,b]$. Define a sequence $\{x_n\}$ such that all the terms of the sequence belong to $[a,b]$. Then, it is clear that there does not exist an $n \in \mathbb{N}$ such that:

$$|f(x_n)| \leq n$$

By a previously proven result, there exists a $c \in [a,b]$ such that every neighbourhood of $c$ contains infinitely many terms of the sequence. So, the function is unbounded in every neighbourhood of $c$. By another previously proven result, it follows that $\lim_{x \to c} f(x)$ does not exist. This contradicts the continuity of $f$ as asserted by the hypothesis. It follows that $f$ must be bounded on $[a,b]$. This proves the desired result.

Does the proof above work? If it doesn’t, why? How can I fix it?

Why is it “clear that there does not exist an $n\in\Bbb N$ such that $|f(x_n)|\leqslant n$”? Besides, what does this mean? Is it for every $n$ or for some $n$?

Suppose that $f$ is unbounded. For each $n\in\Bbb N$, take $x_n\in[a,b]$ such that $\bigl|f(x_n)\bigr|\geqslant n$; such an $x_n$ must exist, since we are assuming that $f$ is unbounded. This sequence has a subsequence $(x_{n_k})_{k\in\Bbb N}$ that converges to some $c\in[a,b]$ (by Bolzano–Weierstrass, together with the closedness of $[a,b]$). Therefore, $\lim_{k\to\infty}f(x_{n_k})=f(c)$. But this is impossible, since every subsequence of $\bigl(f(x_n)\bigr)_{n\in\Bbb N}$ is unbounded and every convergent sequence is bounded.

Math Genius: Show $\lim \dfrac{ a_n }{n} $ exists if $0 \leq a_{n+m} \leq a_n + a_m $

Assume that the terms of the sequence $(a_n)$ satisfy the conditions

$$ 0 \leq a_{n+m} \leq a_n + a_m \;\;\; \text{for} \;\; n,m \in \mathbb{N} $$

Now prove that $\lim_{n \to \infty} \dfrac{a_n}{n} $ exists.

Attempt at a solution:

First of all, clearly, the sequence $(a_n)$ is bounded below by $0$. Let $b_n = \dfrac{a_n}{n}$. Since $\dfrac{1}{n} > 0$, then $b_n > 0$ and $(b_n)$ is bounded. If we can prove that $(b_n)$ is monotonic, then we will solve the problem.

Notice that if we put $m=n$ in the condition we get

$$ a_{2n} \leq 2 a_n \implies \dfrac{a_{2n} }{2n} \leq \dfrac{a_n}{n} \implies b_{2n} \leq b_n$$

Can we assume from here that $(b_n)$ is decreasing? I’m not a hundred percent sure about this step. Can someone tell me if I am on the right track to solve the problem?

Unfortunately, that approach does not work. It is correct that $b_{2n} \le b_n$, and even $b_{kn} \le b_n$ for all positive integers $n, k$. But that does not imply that $b_n = a_n/n$ is decreasing. A counterexample is
$$
a_n = \left\lceil \frac n2 \right\rceil = 1, 1, 2, 2, 3, 3, 4, 4, \ldots
$$

which satisfies $0 \leq a_{n+m} \leq a_n + a_m$, but
$$
b_n = \frac{a_n}{n} = 1, \frac 12, \frac 23, \frac 12, \frac 35, \frac 12, \frac 47, \frac 12, \ldots
$$

decreases and increases alternately.
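A short numerical check of this counterexample (a minimal Python sketch): subadditivity holds, but $b_n$ oscillates instead of decreasing.

```python
import math

# counterexample: a_n = ceil(n/2) is subadditive, but b_n = a_n/n is not monotone
a = lambda n: math.ceil(n / 2)

# spot-check 0 <= a_{n+m} <= a_n + a_m on a small range
assert all(0 <= a(n + m) <= a(n) + a(m) for n in range(1, 50) for m in range(1, 50))

print([round(a(n) / n, 3) for n in range(1, 13)])
# [1.0, 0.5, 0.667, 0.5, 0.6, 0.5, 0.571, 0.5, 0.556, 0.5, 0.545, 0.5] -> limit is 1/2
```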

For a working proof see for example Prove $\lim_{n\to\infty} \frac{a_n}{n}$ exists for positive sequence where $a_{n+m} \leq a_n + a_m$.

Math Genius: Find the value of $\sum_{n=1}^{\infty}\:\frac{a}{n\left(n+a\right)}$

Find the value of $\sum_{n=1}^{\infty}\:\frac{a}{n\left(n+a\right)}$ for $a>0$.

I have only got as far as $\sum_{n=1}^{\infty}\:\frac{a}{n\left(n+a\right)}=a\left(\frac{1}{1}-\frac{1}{1+a}+\frac{1}{2}-\frac{1}{2+a}+\frac{1}{3}-\frac{1}{3+a}+\cdots+\frac{1}{n}-\frac{1}{n+a}+\cdots\right)$

Can anyone help me? Thanks

Let’s call the $n$-th partial sum $V_n$, so the original sum is $\lim_{n \to \infty} V_n$.

Asymptotic solution for $V_n$ with $a>0$: writing $V_n = \sum_{k=1}^{n}\left(\frac{1}{k}-\frac{1}{k+a}\right) = H_n - S_n$, the first sum is harmonic, so it is $\log n + O(1)$. The second sum is
$$
S_n = \sum_{k=1}^{n} \frac{1}{k+a}
$$

Each (argument, value) pair in this sum, $(1, \frac{1}{1+a}), (2, \frac{1}{2+a}), \ldots, (n, \frac{1}{n+a})$, is in fact the area of a rectangle: $r_1 = (2-1) \times \frac{1}{1+a},\ r_2= (3-2) \times \frac{1}{2+a},\ \ldots,\ r_n = (n+1-n) \times \frac{1}{n+a}$, so the sum $S_n$ is equal to the sum of the areas of these rectangles.

The next step is to compare each $r_j$ to the function $f(x) = \frac{1}{x+a}$. On each interval $[j, j+1]$ the area of $r_j$ upper-bounds the integral of $f(x)$:

$$
r_j > \int_{j}^{j+1} f(x)\,dx = \log \frac{j+1+a}{j+a}
$$

If we sum the LHS and RHS of this inequality, we get a lower bound on $S_n$:

$$
S_n = \sum_{j=1}^{n} r_j > \sum_{j=1}^{n} \log \frac{j+1+a}{j+a} = \log (n+a+1) - \log (a+1)
$$

As a result, you get an upper bound on the original sum:

$$
V_n < H_n - \log (n+a+1) + \log (a+1) = \log (a+1) + \gamma + \log \left(\frac{n}{n+a+1}\right) + O\left(\frac{1}{n}\right) = \log (a+1) + \gamma + O\left(\frac{1}{n}\right)
$$

EDIT: got the sign wrong the first time. Also $\log \frac{n}{n+a+1} = -\log \left(1+\frac{a+1}{n}\right) \sim - \frac{a+1}{n} = O\left(\frac{1}{n}\right)$
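A quick numerical check of this upper bound (a minimal Python sketch, with $\gamma$ hard-coded and the helper name mine): the partial sums indeed stay below $\log(a+1)+\gamma$.

```python
import math

GAMMA = 0.5772156649015329   # Euler-Mascheroni constant

def partial_sum(a, n):
    # V_n = sum_{k=1}^{n} a / (k*(k+a))
    return sum(a / (k * (k + a)) for k in range(1, n + 1))

for a in (0.5, 1.0, 3.0):
    bound = math.log(a + 1) + GAMMA
    print(f"a={a}:  V_n (n=10^6) = {partial_sum(a, 10**6):.6f}   bound log(a+1)+gamma = {bound:.6f}")
```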

It is $\psi (a + 1) + \gamma$, where $\psi$ is the logarithmic derivative of the gamma function and $\gamma$ is the Euler–Mascheroni constant, cf. http://dlmf.nist.gov/5.7.E6 and http://dlmf.nist.gov/5.5.E2. Using this fact, it follows for example that
$$
\log a + \gamma + \frac{1}{{2a}} - \frac{1}{{12a^2 }} < \sum\limits_{n = 1}^\infty {\frac{a}{{n(n + a)}}} < \log a + \gamma + \frac{1}{{2a}}
$$

for all $a>0$ (see http://dlmf.nist.gov/5.11.ii). Also, for $-1<a<1$, it holds that
$$
\sum\limits_{n = 1}^\infty {\frac{a}{{n(n + a)}}} = \sum\limits_{k = 2}^\infty {( - 1)^k \zeta (k)a^{k - 1} } ,
$$

where $\zeta$ denotes Riemann’s zeta function (see http://dlmf.nist.gov/5.7.E4).
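This closed form is easy to confirm numerically (a minimal Python sketch; it assumes SciPy is available for the digamma function, and the helper name is mine):

```python
import math
from scipy.special import digamma  # assumed available

GAMMA = 0.5772156649015329   # Euler-Mascheroni constant

def partial_sum(a, n):
    return sum(a / (k * (k + a)) for k in range(1, n + 1))

for a in (0.5, 1.0, math.pi):
    print(f"a={a:.4f}:  psi(a+1)+gamma = {digamma(a + 1) + GAMMA:.8f}"
          f"   partial sum (n=10^6) = {partial_sum(a, 10**6):.8f}")
```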

This answers the case $a \in \mathbb{Z}^+$ (scroll down for real positive $a$).

These types of problems are known as telescoping sums.

You may easily find a good number of these here.

$$\frac {(n+a)-(n)}{(n)(n+a)} = \frac 1n - \frac 1{n+a}$$

So in the given sum, since $a$ is an integer, the sum can be rewritten as:

$$\sum \left\lbrace \frac {a}{(an+a)(an)} +\frac {a}{(an+a+1)(an+1)} + \cdots + \frac {a}{(an+2a -1)(an+a-1)}\right\rbrace$$

In other words, all the terms at positions that give the same remainder modulo $a$ are grouped together for convenience.

Within a grouping,

$$\left\lbrace \frac 1{k} - \frac 1{a+k} \right\rbrace + \left\lbrace \frac 1{a+k} - \frac 1{2a+k} \right\rbrace + \cdots$$

The last term tends to $0$, so each grouping telescopes to $\frac 1k$.

So finally,

$$\text {given sum} = \sum _{k=1} ^a \frac 1{k}$$
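A quick numerical check for integer $a$ (a minimal Python sketch, helper name mine): the partial sums do approach the harmonic number $H_a$.

```python
# For integer a, the claim is that the sum telescopes to H_a = 1 + 1/2 + ... + 1/a.
def partial_sum(a, n):
    return sum(a / (k * (k + a)) for k in range(1, n + 1))

for a in (1, 2, 5):
    harmonic = sum(1.0 / k for k in range(1, a + 1))
    print(f"a={a}:  H_a = {harmonic:.6f}   partial sum (n=10^6) = {partial_sum(a, 10**6):.6f}")
```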

If $a \in \mathbb{R}^+$,

You may solve this using something called Riemann sums, which have a huge back story and are concerned with how the concept of an integral developed.

$$\lim _{n \rightarrow \infty}\sum _{k=0} ^{n} \frac {b-a}{n} f\left( a+\frac {k(b-a)}{n}\right) = \int _a ^b f(x)\, dx$$

$$\text {Given Sum} = \lim _{N \rightarrow \infty} \lim _{n \rightarrow \infty} \sum _{k=1} ^n \frac {N-0}{n} \frac {a}{\left( k \frac Nn\right) \left( k \frac Nn + a\right)}$$ $$= \sum _{k=0} ^n \frac {N}{n} \frac {a}{\left( (k+1) \frac Nn\right) \left( (k+1) \frac Nn + a\right)}$$

$$= \int _0 ^\infty \frac {a}{(x+1)(x+1+a)}\, dx$$

Once again you may telescope like I did in the integer case to get that this equals the integral below.

$$\int _0 ^a \frac 1{x+1}\, dx$$ which is also equal to the more general solution to the problem discussed in Gary’s answer, $\psi^{(0)} (a+1) + \gamma$ (the result for the integer-$a$ case was equal to this general form too).

Math Genius: Contour integration appears to be wrong. Why?

I want to solve the following integral using contour integration:

$$I = \int_0^{\infty} \frac{e^{i x}}{x^2 +1}\, dx. \tag{1}$$

I built a contour consisting of three paths: $A$ (going from $0$ to $\infty$), $B$ (going from $-\infty$ to $0$) and $C$, the upper semicircle.

I define

$$f(z) = \frac{e^{i |z|}}{z^2+1}, \tag{2}$$

where $|\cdot|$ stands for the modulus of a complex number.

I have that

$$\oint f(z)\, dz = \left( \int_A + \int_B + \int_C \right) f(z)\, dz. \tag{3}$$

Now,

$$ \int_{A} f(z)\, dz = \int_0^{\infty} f(x)\, dx = I, \tag{4}$$
since $|x| = x$ if $x \geq 0$.

Likewise,

$$ \int_{B} f(z)\, dz = \int_{-\infty}^0 f(x)\, dx = - \int_{\infty}^0 f(-x)\, dx = \int_0^{\infty} \frac{e^{i |- x|}}{(-x)^2+1}\, dx = \int_0^{\infty} \frac{e^{i x}}{x^2+1}\, dx = I. \tag{5}$$

The integral over the contour $C$ is $0$ when the semicircle is “at” $\infty$.

I evaluate the l.h.s. of Eq. $(3)$ using Cauchy’s theorem, and I find

$$\oint f(z)\, dz = 2 \pi i \frac{e^{i |i|}}{2 i} = \pi e^i. \tag{6}$$

Putting it all together, I find that

$$I = \frac{\pi}{2} e^i \approx 0.848705 + 1.32178 i, \tag{7}$$

whereas Mathematica says that

$$I \approx 0.577864 + 0.646761 i. \tag{8}$$

What is wrong with my reasoning?
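For reference, the value in $(8)$ can be reproduced by straightforward numerical quadrature, splitting $e^{ix}$ into $\cos x + i\sin x$ (a minimal Python sketch assuming SciPy is available); it also prints the proposed value $(7)$ for comparison:

```python
import cmath
import math
from scipy import integrate  # assumed available

# quad's 'cos'/'sin' weights handle the oscillatory tail on [0, inf)
re_part, _ = integrate.quad(lambda x: 1.0 / (x * x + 1.0), 0, math.inf, weight='cos', wvar=1.0)
im_part, _ = integrate.quad(lambda x: 1.0 / (x * x + 1.0), 0, math.inf, weight='sin', wvar=1.0)

print(complex(re_part, im_part))          # ~ (0.577864+0.646761j), matching (8)
print(math.pi / 2 * cmath.exp(1j))        # ~ (0.848705+1.321783j), the value claimed in (7)
```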

Math Genius: Interval of p where $\int_0^\infty{\sqrt{x}\sin(\frac{1}{x^p})}\,dx$ converges

$$\int_0^\infty{\sqrt{x}\sin{\left(\frac{1}{x^p}\right)}}\,dx$$

In my attempt I used the small-angle approximation of $\sin{x}$. I stated that as $x$ approaches infinity, $\frac{1}{x^p}$ approaches $0$, so $\sin{\left(\frac{1}{x^p}\right)} \approx \frac{1}{x^p}$, and simplified the integral to $$\int_0^\infty{\sqrt{x}\cdot\frac{1}{x^p}}\,dx$$ getting $$\int_0^\infty{\frac{1}{x^{p-\frac{1}{2}}}}\,dx.$$ I stated that for the integral to converge, $p-\frac{1}{2}>1$, because a $p$-series needs to meet this requirement in order to converge.

Am I doing this correctly?

Your answer is correct but there is a flaw in the argument. $\int_0^{\infty} \frac 1{x^{p-\frac 1 2}}\,dx=\infty$ for all $p$. What you have to do is to split the integral into integrals from $0$ to $1$ and from $1$ to $\infty$. The first integral converges for all $p$. The second integral converges iff $p >\frac 3 2$, by your argument.
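A brief numerical illustration of the cutoff at $p=\tfrac32$ (a minimal Python sketch assuming SciPy is available): the truncated tail integral keeps growing for $p=1$ but settles down for $p=2$.

```python
import math
from scipy import integrate  # assumed available

def tail(p, R):
    # integral of sqrt(x) * sin(1/x^p) from 1 to R
    val, _ = integrate.quad(lambda x: math.sqrt(x) * math.sin(x ** (-p)), 1, R, limit=500)
    return val

for p in (1.0, 2.0):   # p = 1 < 3/2 should diverge, p = 2 > 3/2 should converge
    print(f"p={p}: ", [round(tail(p, R), 4) for R in (10, 100, 1000, 10_000)])
```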
