## At the Mountains of Gauss

This is a pretty silly waste of time and computational power, but I had wanted to try it for a very long time: I finally got around to making some “fractal” landscapes.

The idea is that if you generate a lot of Gaussian bumps with randomly distributed parameters and add them all together, you get a surface that looks like some natural landscape – a mountain, perhaps, or an island – and then by playing with the parameters of the distribution you can make your landscape look more or less rugged.

So, I started with a plain, added some bumps, and got this:

This doesn’t look very natural, does it? But then I added several more layers of smaller bumps, and created The Jelly Mountain:

Jelly Mountain looks pretty realistic, especially if you are willing to look past the hideousness of the color scheme. When I got slightly better at choosing parameters, I created The Candy Archipelago:

This has no theoretical importance whatsoever, but it’s still fun making something life-like out of equations and matrices.

Here’s some of my code for the curious and the masochistic:
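In outline, the generator looks something like this (a simplified sketch, assuming NumPy; the layer count, bump count, and width values here are illustrative, not the exact parameters behind the pictures above):

```python
import numpy as np

def fractal_landscape(size=200, layers=3, bumps_per_layer=40, seed=0):
    """Sum random Gaussian bumps; each layer uses smaller, finer bumps."""
    rng = np.random.default_rng(seed)
    x, y = np.meshgrid(np.linspace(0, 1, size), np.linspace(0, 1, size))
    z = np.zeros((size, size))
    width = 0.2                      # widest bumps in the first layer
    for _ in range(layers):
        for _ in range(bumps_per_layer):
            cx, cy = rng.random(2)   # random bump center in the unit square
            h = rng.normal(0, width) # random signed height, scaled to width
            z += h * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * width ** 2))
        width /= 2                   # next layer: half-size bumps
    return z

z = fractal_landscape()
print(z.shape)  # (200, 200)
```

Rendering `z` with any surface plotter (and a less hideous colormap than mine) gives the kind of landscapes shown above; the `width /= 2` halving is what adds the finer “ruggedness” layer by layer.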

## Cubed once again

Weird fact of the day: suppose $x_1, \dots, x_k \geq 0$. Then:

$\lim_{n\rightarrow\infty}\left(\sum_{j=1}^{k} x_{j}^{n} \right)^{1/n}=\max \lbrace x_1, \dots, x_k \rbrace$

What does it really mean? It means: take some nonnegative numbers $x_1, \dots, x_k$, raise each to the power of $n$, add them all together, and take the $n$-th root of the sum; as you increase $n$, the result approaches the greatest of the numbers you started with.
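A quick numerical sanity check (computed in log space so that $x^n$ doesn’t overflow for large $n$):

```python
import numpy as np

# Check that (x_1^n + ... + x_k^n)^(1/n) approaches max(x_j) as n grows.
xs = np.array([0.5, 1.3, 2.0, 1.9])
for n in [1, 10, 100, 1000]:
    # log-space: log(sum x^n) = logaddexp-reduce of n*log(x)
    root = np.exp(np.logaddexp.reduce(n * np.log(xs)) / n)
    print(n, root)
# the printed values approach max(xs) = 2.0
```

Already at $n=100$ the contribution of everything except the largest number is numerically negligible.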

Why is that weird? Because the procedure is symmetric with respect to all the numbers $x_1, \dots, x_k$, yet the result depends on only one of them, the greatest one (let’s call it $x_\ast$). All the other $k-1$ numbers are pretty much completely irrelevant: you can set them all to zero and it will not change the outcome, because all the work is done by $x_\ast$, and by $x_\ast$ alone. How does the procedure “determine” which number is the greatest?

How do we prove this amazing fact? Like this:

We already defined $x_\ast:=\max \lbrace x_1, \dots, x_k \rbrace$. Now:

$(x_\ast^{n} )^{1/n} \leq \left(\sum_{j=1}^{k} x_{j}^{n} \right)^{1/n} \leq (k \cdot x_\ast^{n} )^{1/n}$ and

$\lim_{n\rightarrow\infty}(x_\ast^{n})^{1/n}=\lim_{n\rightarrow\infty}(k\cdot x_\ast^{n} )^{1/n}=x_\ast$, therefore $\lim_{n\rightarrow\infty}\left(\sum_{j=1}^{k} x_{j}^{n} \right)^{1/n}=x_\ast$, the end.

Does it work with other families of functions $\varphi_n(x)$ or only with powers and roots? The proof only uses some monotonicity and the fact that $\lim_{n\rightarrow\infty} \varphi_n^{-1}(k\cdot \varphi_n(x))=x$, so it should also work, for example, when $\varphi_n(x)=n^x$, meaning that:

$\lim_{n\rightarrow\infty}\left(\log_n\sum_{j=1}^{k} n^{x_j} \right)=\max \lbrace x_1, \dots, x_k \rbrace$
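The same numerical check for $\varphi_n(x)=n^x$ (the convergence is noticeably slower in this case):

```python
import math

# Check that log_n(n^{x_1} + ... + n^{x_k}) approaches max(x_j) as n grows.
xs = [0.5, 1.3, 2.0, 1.9]
for n in [10, 100, 10000]:
    val = math.log(sum(n ** x for x in xs), n)
    print(n, val)
# the printed values creep down toward max(xs) = 2.0
```

Even at $n=10000$ the result is still a few percent above the true maximum, because the second-largest exponent, 1.9, is close to 2.0.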

I can’t think of a good way to completely characterize the class of functions for which the proof holds. Can you?

What does this mean geometrically? It means that as $n$ increases, the level surfaces of the function $f_n(x_1, \dots, x_k)=\left(\sum_{j=1}^{k} |x_{j}|^{n} \right)^{1/n}$, defined on a $k$-dimensional space, start to resemble a $k$-dimensional cube, because a cube is a level surface of the function $f_\infty(x_1, \dots, x_k)=\max \lbrace |x_1|, \dots, |x_k| \rbrace$. You might remember that we already spoke about this in my previous post titled “On Norms”.

Have you made a cool-looking but completely pointless animation to demonstrate this phenomenon? I thought you’d never ask:

(you may or may not need to click on it)

Does this fact have any useful applications? Not that I know of, which proves that this is some high quality math, and not some applied rubbish. (Just kidding, applied math can be beautiful too. Once in a thousand years.)

Posted in Math | 6 Comments

## One small step

Weird fact of the day: $\arctan(x)+\arctan(x^{-1})=\operatorname{sgn}(x)\frac{\pi}{2}$ for $x \neq 0$.

Why is it weird? Because the sum of two clearly non-constant functions is constant. Well, more or less so.

How to prove it? Differentiate.

How can I use it? You can confuse people with it: “Hello, clerk, I want $2\arctan(e^{\pi})/\pi+2\arctan(e^{-\pi})/\pi$ tickets to this movie.” Also, it works as a very fancy, if slightly incorrect, way to write the signum function.
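If you don’t trust the differentiation argument, a quick spot-check (and a count of how many tickets the clerk should hand over):

```python
import math

# Spot-check arctan(x) + arctan(1/x) = sgn(x) * pi/2 for a few values.
for x in [0.1, 1.0, 7.5, -2.0, -100.0]:
    lhs = math.atan(x) + math.atan(1 / x)
    rhs = math.copysign(math.pi / 2, x)  # sgn(x) * pi/2
    assert abs(lhs - rhs) < 1e-12

# The ticket order: since e^pi > 0 and e^{-pi} is its reciprocal,
# arctan(e^pi) + arctan(e^{-pi}) = pi/2, so the total is exactly 1.
tickets = (2 * math.atan(math.e ** math.pi) / math.pi
           + 2 * math.atan(math.e ** -math.pi) / math.pi)
print(tickets)  # 1.0, up to floating-point error
```

So the whole incantation is just an elaborate way of asking for one ticket.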

Posted in Uncategorized | 2 Comments

## Another puzzle thing

This one is going to be more tricky.

Imagine a sphere with two linked handles. Is it homeomorphic to a sphere with two unlinked handles? Well, you know the classification theorem, so you know that the answer is “yes”. But can you actually imagine a gradual, continuous deformation that unlinks two handles? (If you do, you must have an extraordinarily good spatial imagination.)

Solution (it involves creepy, googly-eyed men with no legs):

Posted in Uncategorized | 2 Comments

## Puzzle thing

This puzzle is supposed to be a classic, but for some reason I haven’t heard about it, and there is a possibility that neither have you. Find the length of the red line, using only school-level math:

This is one of those puzzles that seems difficult, but has a simple, retrospectively-obvious solution. Don’t worry, though: you will probably find the answer in a matter of minutes, if not faster. It’s pretty easy, as far as these things go.

A nifty trick I learned yesterday: suppose you have a big matrix with integer elements, like this one:

$M= \begin{bmatrix} 23 & 0 & 3 & 78 & 56\\ 12& 17 & 16& 20 & 100\\ 22& 14 & 111 & 1 & 15\\ 90& 32 & 54 & 29 & 12\\ 22 & 18 & 10 & 94 & 7\\ \end{bmatrix}$

and you want to prove that it’s invertible. What do you do? You consider the same matrix modulo 2:

$M \equiv \begin{bmatrix} 1 & 0 & 1 & 0 & 0\\ 0 & 1 & 0 & 0 & 0\\ 0 & 0 & 1 & 1 & 1\\ 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0 & 1\\ \end{bmatrix} \pmod 2$

Now you can see that it’s triangular, and therefore $\det(M) \equiv 1 \pmod 2$, so the determinant is definitely non-zero, and $M$ really is invertible. If it had turned out that $\det(M) \equiv 0 \pmod 2$, that wouldn’t imply that $M$ is degenerate; it would mean the result is inconclusive, and you could, for example, try doing the same thing modulo 3, and so on.
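The trick is easy to mechanize. Here’s a minimal sketch (the function name is mine) that runs Gaussian elimination exactly over the field $\mathbb{Z}/p\mathbb{Z}$, so there is no floating point and no rounding to worry about:

```python
def det_mod_p(M, p):
    """Determinant of an integer matrix modulo a prime p."""
    A = [[x % p for x in row] for row in M]
    n, det = len(A), 1
    for i in range(n):
        # find a row with a non-zero entry in column i to use as pivot
        piv = next((r for r in range(i, n) if A[r][i]), None)
        if piv is None:
            return 0                     # whole column is zero: det = 0 mod p
        if piv != i:
            A[i], A[piv] = A[piv], A[i]  # row swap flips the sign
            det = -det % p
        det = det * A[i][i] % p
        inv = pow(A[i][i], p - 2, p)     # modular inverse via Fermat
        for r in range(i + 1, n):
            f = A[r][i] * inv % p
            for c in range(i, n):
                A[r][c] = (A[r][c] - f * A[i][c]) % p
    return det

M = [[23,  0,   3, 78,  56],
     [12, 17,  16, 20, 100],
     [22, 14, 111,  1,  15],
     [90, 32,  54, 29,  12],
     [22, 18,  10, 94,   7]]
print(det_mod_p(M, 2))  # 1, so det(M) is odd and M is invertible
```

And if this returns 0 for $p=2$, you just call it again with `p=3`, `p=5`, and so on, exactly as described above.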