The intended cute solution

The question I meant to ask on Sunday was whether $\sum_{m,n=1}^\infty1/(m^2+n^2)^2$ converges or diverges, so I’ll give my solution to that problem now. In fact, the methodology closely resembles the divergence proof I gave for the other sum, even though this sum converges.

Since all terms in the sum are positive, the sum either converges absolutely or diverges (the sum cannot be conditionally convergent). In either case the terms of the sum may be rearranged without affecting the convergence. For suppose the sum diverges but converges for some rearrangement: since the terms of the rearranged sum are still positive, the rearranged sum would have to converge absolutely, and therefore converge for all rearrangements, contradicting the supposed diverging arrangement.

Therefore we can rearrange the terms as we please; in particular, we can sort the terms in decreasing order, which has the effect of grouping together all terms of the form $1/k^2$ where $k$ is a sum of two squares. The term $1/k^2$ will appear in the sum once for each solution of $m^2+n^2=k$ in positive integers $m$, $n$.¹

For our purposes it is sufficient to note there are at most $\sqrt{k}$ solutions to $m^2+n^2=k$. This follows since we know that $1\leq m\leq\sqrt{k}$ and for each value of $m$ there is at most one solution; the only possible value of $n$ which could work is $\sqrt{k-m^2}$.
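
This bound is easy to check numerically. Here is a short Python sketch (my own addition, not part of the proof) that counts representations directly:

```python
import math

def count_reps(k):
    """Number of solutions to m^2 + n^2 = k in positive integers m, n
    (ordered pairs, so (1, 2) and (2, 1) both count)."""
    count = 0
    for m in range(1, math.isqrt(k) + 1):
        n2 = k - m * m
        # For each m there is at most one candidate n, namely sqrt(k - m^2).
        if n2 >= 1 and math.isqrt(n2) ** 2 == n2:
            count += 1
    return count

# The number of representations of k never exceeds sqrt(k).
assert all(count_reps(k) <= math.isqrt(k) for k in range(2, 10001))
```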

The argument proceeds as follows:

\[ \newcommand{\N}{\mathbb{N}}\begin{align*}
\sum_{m,n\in\N}\frac{1}{(m^2+n^2)^2} &= \sum_{k=2}^\infty\sum_{\substack{m,n\in\N\\m^2+n^2=k}}\frac{1}{k^2} \\
&\leq \sum_{k=2}^\infty\frac{\sqrt{k}}{k^2} \\
&= \sum_{k=2}^\infty\frac{1}{k^{3/2}} \\
&= \zeta(3/2)-1
\end{align*} \]

The final sum converges since it is a (truncated) $p$-series with $p=3/2>1$. By the comparison test, the sum in question also converges.
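
As a numerical sanity check (my own sketch, not part of the proof), the partial sums over an $N\times N$ grid can be compared against the $\zeta(3/2)-1\approx1.612$ bound:

```python
def partial_sum(N):
    """Partial sum of 1/(m^2 + n^2)^2 over 1 <= m, n <= N."""
    return sum(1.0 / (m * m + n * n) ** 2
               for m in range(1, N + 1)
               for n in range(1, N + 1))

# Partial sums increase with N and stay well below zeta(3/2) - 1 ~= 1.612.
s50, s400 = partial_sum(50), partial_sum(400)
assert s50 < s400 < 1.612
```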

  1. The exact number of solutions to $m^2+n^2=k$ is essentially the sum of squares function $r_2(k)$, although since we are working exclusively with solutions in positive integers the actual number of solutions will be $r_2(k)/4$ if $k$ is not a perfect square and $(r_2(k)-4)/4$ if $k$ is a perfect square.

    Via MathWorld, we see that if $r_2(k)$ is nonzero then it is equal to $4B(k)$, where $B(k)$ is the number of divisors of $k$ solely comprised of primes congruent to $1$ mod $4$. In any case, this shows that the maximum number of times $1/k^2$ can appear in the summation is $d(k)$, the number of divisors of $k$. In Apostol (page 296) it is shown that $d(k)=o(k^\epsilon)$ for any $\epsilon>0$, so this is a fairly slowly growing function.

Another cute solution

Previously I asked if the summation $\sum_{m,n=1}^\infty1/(m^2+n^2)$ converges or diverges. Actually, I intended the denominator of the summation term to be $(m^2+n^2)^2$. But never mind, let’s solve the problem as given. To do this, we’ll employ the comparison test.

First, note that

\[ m^2+n^2 \leq m^2+2mn+n^2 = (m+n)^2 , \]

so $1/(m^2+n^2)\geq1/(m+n)^2$.

Next, assume the sum converges; since all terms are positive the sum converges absolutely and the terms may be rearranged without affecting its value. In particular, we rearrange the terms in decreasing order by grouping all terms equal to $1/k^2$ together for each possible value of $k$.

Finally, we use the fact that there are exactly $k-1$ solutions to $m+n=k$ in positive integers $m$, $n$. Putting it all together, the argument goes as follows:

\begin{align*}
\newcommand{\N}{\mathbb{N}}
\sum_{m,n\in\N}\frac{1}{m^2+n^2} &\geq \sum_{m,n\in\N}\frac{1}{(m+n)^2} \\
&= \sum_{k=2}^\infty\sum_{\substack{m,n\in\N\\m+n=k}}\frac{1}{k^2} \\
&= \sum_{k=2}^\infty\frac{k-1}{k^2} \\
&\geq \frac{1}{2}\sum_{k=2}^\infty\frac{1}{k} \\
&= \infty
\end{align*}

The final inequality uses $k-1\geq k/2$ for $k\geq2$, leaving a tail of the harmonic series, which diverges. By the comparison test the original sum diverges; this contradicts the assumption that it converges, so the sum really does diverge, as required.
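
As a numerical illustration (my own sketch, not part of the original post), the partial sums over an $N\times N$ grid grow by a roughly constant amount each time $N$ doubles, which is the signature of logarithmic growth:

```python
def grid_sum(N):
    """Partial sum of 1/(m^2 + n^2) over 1 <= m, n <= N."""
    return sum(1.0 / (m * m + n * n)
               for m in range(1, N + 1)
               for n in range(1, N + 1))

# Doubling N adds a near-constant amount (roughly (pi/2) ln 2 ~= 1.089),
# so the partial sums grow like log N and increase without bound.
gaps = [grid_sum(2 * N) - grid_sum(N) for N in (50, 100, 200)]
assert all(1.0 < g < 1.2 for g in gaps)
assert max(gaps) - min(gaps) < 0.05
```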

Next up, I’ll show the question I intended to ask: does $\sum_{m,n=1}^\infty1/(m^2+n^2)^2$ converge or diverge?

Polynomial problem answer

A month ago I posed the problem of showing that a monic $\newcommand{\Z}{\mathbb{Z}}f\in\Z[x]$ is squarefree over $\newcommand{\C}{\mathbb{C}}\C$ if and only if it is squarefree over $\Z$.

One direction is straightforward, although the easiest way to see it is to consider the contrapositive. If $f$ is not squarefree over $\Z$ then its factorization over $\Z$ is of the form $hg^2$ where $h$, $g\in\Z[x]$ and $g$ is nonconstant. Since $f$ can only factor further in $\C$ it follows that $f$ is also not squarefree over $\C$; in particular any root $\alpha\in\C$ of $g$ will have multiplicity at least $2$ in the factorization of $f$ in $\C$.

Thus if $f$ is squarefree over $\C$ it is also squarefree over $\Z$. Conversely, if $f$ is not squarefree over $\C$ then it has some multiple root $\alpha\in\C$ and its factorization is of the form $k\cdot(x-\alpha)^2$ for some $k\in\C[x]$, so $f'=k'(x-\alpha)^2+2(x-\alpha)k$ and $\alpha$ is also a root of $f'$.

Since $f$ and $f'$ are polynomials with integer coefficients with $\alpha$ as a root, the minimal polynomial $g$ of $\alpha$ over $\newcommand{\Q}{\mathbb{Q}}\Q$ divides both $f$ and $f'$. In particular, there must be some $h\in\Q[x]$ such that $f=gh$. Then $f'=g'h+h'g$ and since $g\mid f'$ and $g\mid h'g$ we must have $g\mid g'h$. However, $g\nmid g'$ since $g'$ has smaller degree than $g$ and $g'$ is nonzero (as the characteristic of $\Q$ is $0$).

Being a minimal polynomial, $g$ is irreducible, and it is also prime as $\Q[x]$ is a UFD. Thus from $g\mid g'h$ and $g\nmid g'$ we must have $g\mid h$, i.e., $g^2\mid f$ and $f$ is not squarefree over $\Q$. By Gauss’ Lemma the factorization $f=g^2(h/g)$ over $\Q$ may actually be taken to be over $\Z$ (by replacing $g$ and $h$ with their primitive parts), so $f$ is not squarefree over $\Z$, as required.

Note that the requirement that $f$ be monic is necessary (at least, the content of $f$ should be squarefree). For example, if $f:=4$ then $f=2\cdot2$ is not squarefree over $\Z$, but is technically squarefree over $\C$, since $4$ is a unit in $\C$!
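
This equivalence also yields a practical squarefreeness test: over a field of characteristic $0$, $f$ has a repeated root exactly when $\gcd(f,f')$ is nonconstant. A minimal sketch, assuming SymPy is available:

```python
from sympy import symbols, gcd, degree, diff, expand

x = symbols('x')

def is_squarefree(f):
    """f (a polynomial over a characteristic-0 field) has a repeated
    root exactly when gcd(f, f') is nonconstant."""
    return degree(gcd(f, diff(f, x)), x) == 0

assert is_squarefree(x**3 - 2)                         # irreducible, hence squarefree
assert not is_squarefree(expand((x**2 - 2)**2 * (x + 1)))  # square factor
```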

Quick polynomial problem

Here’s an interesting polynomial property that arose in my recent number theory project.

Let $f$ be a monic polynomial with integer coefficients. Show that $f$ is squarefree over $\mathbb{C}$ if and only if $f$ is squarefree over $\mathbb{Z}$.

A weak form of the prime number theorem

The function $\pi(n)$ counts the number of primes $p$ in the range $1\leq p\leq n$. One of the most spectacular theorems in mathematics is the so-called prime number theorem, which states that

\[ \lim_{n\to\infty}\Bigl(\pi(n)\Bigm/\frac{n}{\ln n}\Bigr) = 1 . \]

This is a strong condition on the asymptotic behaviour of $\pi(n)$ as $n$ increases without bound—unlike some asymptotic results, even the base of the logarithm is significant here.

Last year I worked through a proof of the prime number theorem. Two of the main results used were Euler’s product and Cauchy’s residue theorem—it was breathtaking. It still hasn’t completely “coalesced” in my head, but I periodically revisit it. Today I want to prove a weaker form of the prime number theorem, namely that

\[ \pi(n) = \Theta\Bigl(\frac{n}{\ln n}\Bigr) . \]

This uses asymptotic notation, and says that the growth rate of $\pi(n)$ is eventually within a constant factor of $n/\ln n$.

The proof is by Chebyshev and proceeds by finding the following bounds on the central binomial coefficient $\binom{2n}{n}$:

\begin{align}
2^n \leq \binom{2n}{n} &\leq 4^n \tag{1} \\
n^{\pi(2n)-\pi(n)} \leq \binom{2n}{n} &\leq (2n)^{\pi(2n)} \tag{2}
\end{align}

Upper bound of (1)

Using the binomial theorem to expand $(1+1)^{2n}$ gives

\[ 4^n = 2^{2n} = (1+1)^{2n} = \sum_{i=0}^{2n}\binom{2n}{i} \geq \binom{2n}{n} \]

since all the terms in the sum are positive.

Lower bound of (1)

However, note that the central term ($i=n$) in the sum is the largest (to go from term $i$ to term $i+1$ you multiply by $\frac{2n-i}{i+1}$, which is greater than $1$ until $i\geq n$), so we have:

\[ 4^n = 2^{2n} = (1+1)^{2n} = \sum_{i=0}^{2n}\binom{2n}{i} \leq (2n+1)\binom{2n}{n} \]

The required inequality follows after dividing by $2n+1$ and using the fact that $2n+1\leq2^n$ (which holds for $n\geq3$; the inequality may be checked explicitly for $n<3$), since then $\binom{2n}{n}\geq4^n/(2n+1)\geq4^n/2^n=2^n$.
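
Both sides of (1), including the intermediate step $4^n\leq(2n+1)\binom{2n}{n}$, are easy to verify directly for small $n$ (a sanity check, not a proof):

```python
from math import comb

# Check 2^n <= C(2n, n) <= 4^n, plus the intermediate bound
# 4^n <= (2n + 1) * C(2n, n), for a range of n including n < 3.
for n in range(1, 200):
    c = comb(2 * n, n)
    assert 2 ** n <= c <= 4 ** n
    assert 4 ** n <= (2 * n + 1) * c
```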

Lower bound of (2)

Examining the prime factorization of $\binom{2n}{n}=(2n)!/(n!)^2$, we see that it contains all the primes $p$ in the range $n<p\leq 2n$, since they appear in the numerator but not in the denominator. There are $\pi(2n)-\pi(n)$ primes in this range, and they are all larger than $n$, so it follows that:

\[ \binom{2n}{n} \geq \prod_{n<p\leq 2n} p \geq n^{\pi(2n)-\pi(n)} \]

Upper bound of (2)

Note that there are $\lfloor n/p\rfloor$ multiples of $p$ between $1$ and $n$. We can use this fact to determine the exponent of $p$ in the prime factorization of $n!$. There will be $\lfloor n/p\rfloor$ factors of $p$ in $n!$, but we also need to count the additional $\lfloor n/p^2\rfloor$ factors of $p$ which are contributed by multiples of $p^2$, and in general an additional $\lfloor n/p^k\rfloor$ factors of $p$ contributed by multiples of $p^k$. The exponent of $p$ in the prime factorization of $n!$ is thus

\[ \sum_{k=1}^\infty \biggl\lfloor\frac{n}{p^k}\biggr\rfloor , \]

where the terms are zero for $k>\log_p n$, so the sum is actually finite. It follows that the exponent of $p$ in the prime factorization of $(2n)!/(n!)^2$ is

\[ \sum_{k=1}^{\lfloor\log_p(2n)\rfloor} \biggl(\biggl\lfloor\frac{2n}{p^k}\biggr\rfloor - 2\biggl\lfloor\frac{n}{p^k}\biggr\rfloor\biggr) . \]

An interesting fact about the function $\lfloor 2x\rfloor-2\lfloor x\rfloor$ is that it is either $0$ or $1$ (plot it and you’ll see why: it equals $1$ exactly when the fractional part of $x$ is at least $1/2$), so this sum is actually at most $\lfloor\log_p(2n)\rfloor$. Since $\binom{2n}{n}$ only contains primes $p$ in the range $1\leq p\leq 2n$, it follows that:

\[ \binom{2n}{n} \leq \prod_{1\leq p\leq 2n}p^{\log_p(2n)} = \prod_{1\leq p\leq 2n} 2n = (2n)^{\pi(2n)} \]
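
As a sanity check (my own addition, not part of the argument), both Legendre’s formula for the exponent of $p$ in $n!$ and the two sides of (2) can be verified directly for small $n$:

```python
from math import comb, isqrt

def legendre_exponent(n, p):
    """Exponent of the prime p in n!, i.e. the sum of floor(n / p^k)."""
    e, pk = 0, p
    while pk <= n:
        e += n // pk
        pk *= p
    return e

def primes_upto(N):
    """Primes <= N via a simple sieve of Eratosthenes."""
    sieve = [True] * (N + 1)
    sieve[:2] = [False, False]
    for i in range(2, isqrt(N) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [p for p, is_p in enumerate(sieve) if is_p]

# 10! = 3628800 = 2^8 * 3^4 * 5^2 * 7
assert [legendre_exponent(10, p) for p in (2, 3, 5, 7)] == [8, 4, 2, 1]

# Both sides of (2): n^(pi(2n) - pi(n)) <= C(2n, n) <= (2n)^pi(2n)
for n in range(1, 60):
    ps = primes_upto(2 * n)
    pi_2n = len(ps)
    pi_n = len([p for p in ps if p <= n])
    assert n ** (pi_2n - pi_n) <= comb(2 * n, n) <= (2 * n) ** pi_2n
```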

Using the bounds

Putting both (1) and (2) together and taking logarithms yields the following bounds:

\begin{gather}
n\ln 2 \leq \pi(2n)\ln(2n) \tag{3} \\
(\pi(2n)-\pi(n))\ln n \leq n\ln 4 \tag{4}
\end{gather}

The lower bound on $\pi(n)$

From (3) we have that

\[ \pi(2n) \geq \frac{\ln2}{2}\frac{2n}{\ln(2n)} , \]

which shows the required lower bound for even numbers. But it can be slightly modified to work for odd numbers as well (note that $\ln(2n)\leq\ln(2n+1)$):

\[ \pi(2n+1) \geq \pi(2n) \geq \frac{2n+1}{2n+1}\frac{\ln2}{2}\frac{2n}{\ln(2n)} \geq \frac{2n}{2n+1}\frac{\ln2}{2}\frac{2n+1}{\ln(2n+1)} \]

Since $2n/(2n+1)$ is lower bounded by a constant (e.g., $2/3$) for all positive $n$, this shows that $\pi(n)=\Omega(n/\ln n)$.

The upper bound on $\pi(n)$

First we will show the required upper bound for powers of two. That way we will be able to apply (4) not only on $n$ but also recursively on half of $n$ until $n=1$ is reached. However, to get a nice telescoping sum we modify (4) by subtracting $\pi(n)\ln(n/2)$ from both sides. After rearranging and simplifying with $\ln n-\ln(n/2)=\ln2$, we find

\[ \pi(2n)\ln n - \pi(n)\ln(n/2) \leq n\ln4 + \pi(n)\ln 2 \leq (3\ln 2)n \tag{5} \]

since $\pi(n)\leq n$ and $\ln4+\ln2=3\ln2$.

We now sum together (5) used on $n{,}~n/2{,}~\dotsc{,}~1$ to get

\[ \sum_{i=0}^{\log_2 n}\biggl(\pi\Bigl(\frac{n}{2^{i-1}}\Bigr)\ln\Bigl(\frac{n}{2^i}\Bigr)-\pi\Bigl(\frac{n}{2^i}\Bigr)\ln\Bigl(\frac{n}{2^{i+1}}\Bigr)\biggr) \leq (3\ln 2)\sum_{i=0}^{\log_2 n}\frac{n}{2^i} . \]

However, by telescoping the left sum is equal to $\pi(2n)\ln n$ and by the formula for the summation of a geometric series the right sum is at most $(3\ln2)2n$. In other words, when $n$ is a power of two we have

\[ \pi(2n) \leq (6\ln2)\frac{n}{\ln n} , \]

and the required bound follows using $\frac{n}{\ln n}<\frac{2n}{\ln(2n)}$, a consequence of the fact that $\frac{n}{\ln n}$ is an increasing function (for $n\geq3$, although the required bound still holds for $n<3$).

As before, a slight modification shows the bound holds for all numbers, not just powers of two. Let $m$ be an arbitrary integer in the range $n\leq m<2n$ where $n$ is a power of two. Then

\[ \pi(m) \leq \pi(2n) \leq (6\ln 2)\frac{n}{\ln n} \leq (6\ln 2)\frac{m}{\ln m} \]

which shows that $\pi(n)=O(n/\ln n)$.
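
Finally, a numerical illustration (my own sketch, not part of Chebyshev’s argument): the ratio $\pi(n)/(n/\ln n)$ indeed stays between fixed constants over a large range, as $\pi(n)=\Theta(n/\ln n)$ predicts:

```python
import math

def prime_pi_table(N):
    """Return a list t with t[n] = pi(n) for 0 <= n <= N, via a sieve."""
    sieve = [True] * (N + 1)
    sieve[:2] = [False, False]
    for i in range(2, math.isqrt(N) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    table, count = [], 0
    for n in range(N + 1):
        count += sieve[n]
        table.append(count)
    return table

N = 100_000
pi_table = prime_pi_table(N)

# The ratio pi(n) / (n / ln n) stays between fixed constants,
# consistent with pi(n) = Theta(n / ln n).
ratios = [pi_table[n] * math.log(n) / n for n in range(10, N + 1)]
assert 0.9 < min(ratios) and max(ratios) < 1.3
```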

Minimal polynomial misconception

Yesterday I had a discussion with a friend about computing minimal polynomials. Say you are given algebraic numbers $\alpha$ and $\beta$ specified by minimal polynomials of degrees $n$ and $m$. How do you compute, for example, the minimal polynomial of $\alpha+\beta$?

The method I was taught, which seems to be fairly standard (e.g., see Dummit and Foote Chapter 13.2) is the following:

Multiply $\alpha+\beta$ by $\alpha^i\beta^j$ for each $i=0{,}~1{,}~\dotsc{,}~n-1$ and $j=0{,}~1{,}~\dotsc{,}~m-1$, and use the minimal polynomials of $\alpha$ and $\beta$ to reduce the resulting expressions to linear combinations of $\alpha^i\beta^j$ (again with $0\leq i<n$ and $0\leq j<m$). In other words, what you are doing is computing a matrix $M$ which satisfies

\[ (\alpha+\beta)\left[\alpha^i\beta^j\right]_{i,j} = M\left[\alpha^i\beta^j\right]_{i,j} \]

where $[\alpha^i\beta^j]_{i,j}$ is a column vector containing the $\alpha^i\beta^j$.

From this we see that $\alpha+\beta$ is an eigenvalue of $M$ and therefore is a root of the characteristic polynomial $p_M$ of $M$. If $p_M$ is irreducible then it will be the minimal polynomial of $\alpha+\beta$, but in general the minimal polynomial will divide $p_M$, and so it will be necessary to factor $p_M$.
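
To make this concrete, here is a small sketch of my own (assuming SymPy is available) for $\alpha=\sqrt2$, $\beta=\sqrt3$, using the basis $\{1,\alpha,\beta,\alpha\beta\}$:

```python
from sympy import Matrix, symbols

x = symbols('x')

# alpha = sqrt(2), beta = sqrt(3); column vector v = [1, a, b, ab].
# Multiplying each basis element by (a + b) and reducing with a^2 = 2
# and b^2 = 3 gives (a + b) v = M v, where:
M = Matrix([
    [0, 1, 1, 0],  # (a+b)*1  = a + b
    [2, 0, 0, 1],  # (a+b)*a  = 2 + ab
    [3, 0, 0, 1],  # (a+b)*b  = 3 + ab
    [0, 3, 2, 0],  # (a+b)*ab = 3a + 2b
])

# alpha + beta is an eigenvalue of M, hence a root of its
# characteristic polynomial x^4 - 10x^2 + 1, which is irreducible
# and therefore the minimal polynomial of sqrt(2) + sqrt(3).
p = M.charpoly(x).as_expr()
assert (p - (x**4 - 10*x**2 + 1)).expand() == 0
```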

However, once $p_M$ has been factored, how does one tell which factor is the required minimal polynomial? The obvious answer is to evaluate each factor at $\alpha+\beta$ and see which one gives zero. “You could do that numerically”, my friend says, and I respond with “…or symbolically”. But then he asks if I will always be able to determine if the symbolic expression is zero or not. Well, I hadn’t thought of that, and I admitted I wasn’t sure, but claimed “anything you can do numerically I can do symbolically!”

I spent several hours yesterday trying to solve that problem, but eventually had to go to bed. This morning, after looking in Cohen I found the following passage:

…it does not make sense to ask which of the irreducible factors $\alpha+\beta$ is a root of, if we do not specify embeddings in $\mathbb{C}$…

Wait, what? I got excited as I realized I had just uncovered a misconception of mine!

Note that if the conjugates of $\alpha$ are $\alpha_i$ then they are all “symbolically identical”: the fields $\mathbb{Q}(\alpha_i)$ are isomorphic to each other. From that, I had assumed that the values of $\alpha_i+\beta_j$ would also be symbolically identical for all conjugates of $\alpha$ and $\beta$. Not true! As a trivial counterexample, if $\alpha$ is a root of $x^3-2$ and $\beta$ is a root of $x^3+2$ then two possible values for $\alpha+\beta$ are $0$ and $\sqrt[3]{2}\sqrt{3}i$, and these have very different algebraic behaviour.
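
Numerically this is easy to see; here is a quick sketch of my own (assuming NumPy is available) enumerating the nine possible sums:

```python
import numpy as np

alphas = np.roots([1, 0, 0, -2])  # the three complex roots of x^3 - 2
betas = np.roots([1, 0, 0, 2])    # the three complex roots of x^3 + 2

sums = [a + b for a in alphas for b in betas]

# One pairing of conjugates sums to 0, while another sums to
# cbrt(2)*sqrt(3)*i -- the choices are not interchangeable.
assert min(abs(s) for s in sums) < 1e-8
target = 2 ** (1 / 3) * 3 ** 0.5 * 1j
assert min(abs(s - target) for s in sums) < 1e-8
```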

So the whole problem of computing the minimal polynomial of $\alpha+\beta$ was not well defined unless you specify which roots $\alpha$, $\beta$ you are talking about—for example, by specifying them numerically.

Leaving my head

The TED website hosts a large collection of inspiring talks on many different topics. The talk on how schools kill creativity is particularly fantastic, both funny and insightful, and is in fact the most-watched TED talk to date. During it Ken Robinson makes the following remark:

There’s something curious about professors in my experience — not all of them, but typically — they live in their heads.

Not to put too fine a point on it, this describes me perfectly. (Although I am a graduate student rather than a professor.)

This behaviour has both its strengths and weaknesses. For example, I have almost never felt lonely in my life, despite being alone a good portion of the time. Perhaps this is because by living in my head I am able to keep myself company. Indeed, my modus operandi when thinking about something could be described as a “conversation” with myself in which I try to make arguments for both sides of an issue.

However, like all human emotions, loneliness undoubtedly exists for a reason. In this case, it seems to be a trigger to make friends, which in turn improves one’s well-being. Without loneliness one has less short-term pain, but also less motivation to seek out ultimately beneficial connections.

In my case, I never gave much thought to the downsides, or even viewed them with a kind of perverse pride; the “cost of doing business” as it were. In short, I didn’t make much of an effort to ameliorate the absent-mindedness; in fact, I thought such traits were essentially unchangeable.

Early this year I decided to make a conscious effort to get out of my head. There were multiple reasons, but one is that almost every job I do involves interacting with other people in some way, so I figure that becoming more empathetic will actually improve the quality of my work.

How does one go about venturing out of their head? Well, I am still learning that. One step I’m taking is to learn—and remember!—the names of people I recognize from somewhere. I haven’t had a 100% success rate, but I’m already much better than I was previously, when I would be likely to forget someone’s name before my conversation with them had finished. Additionally, I will not ignore the fact that yes, I have a body, and to this end will pay more attention to things like exercise, diet, and appearance.

Pokémon Yellow is Turing complete

If you’re anything like me, this video will simultaneously shock you, warm your heart, and leave you laughing at its convoluted brilliance.

The video is about 13 minutes long, but the payload comes in the last 30 seconds, where balloons are displayed on screen while music plays in the background. What’s so special about that? The image and music featured do not exist anywhere in the game—they were manually programmed into it by exploiting the game bugs shown in the first 12 minutes of the video. Assuming you know how to program in the Game Boy’s machine language, you can turn Pokémon into any program you want.

The video itself is somewhat tedious to watch, because setting up the “bootstrap” program which allows one to write arbitrary programs was accomplished by acquiring a specific sequence of items in exactly the right quantities. For example, about two full minutes are spent doing nothing but buying lemonade (which can only be purchased one at a time). In addition, for much of the video it is difficult to determine just what’s going on; one gets the impression that the game itself is similarly confused!

For more detail, see this post by the author. My hat’s off to you, Robert McIntyre.