Written by Colin in Uncategorized.

*For 2019, I’m trying an experiment: every couple of weeks, writing a post about a mathematical object that a) I don’t know much about and b) is named after somebody. These posts are a trial run – let me know how you find them!*

The chief use of the Ackermann function, these days, is to show that you can generate REALLY BIG numbers from small arguments. It also plays an important role in computability theory: it was one of the first, and simplest, examples of a total computable function that is not primitive recursive.

A *primitive recursive* function is defined using compositions and primitive recursion – which, in itself, sounds rather recursive. However, primitive recursion has a technical definition, relying on five axioms:

- The constant function $0$ (taking no arguments) is primitive recursive.
- The one-argument successor function $S(x) = x+1$ is primitive recursive.
- The projection function $P_i^n(x_1, x_2, … x_n) = x_i$ is primitive recursive.
- If a primitive recursive function $f$ takes $k$ arguments, and the primitive recursive functions $g_1, g_2, … g_k$ each take $m$ arguments, then their composition $f(g_1(x_1,…x_m), g_2(x_1,…x_m), … g_k(x_1,…x_m))$ is primitive recursive.
- If $f$ is primitive recursive and takes $k$ arguments, and $g$ is primitive recursive and takes $k+2$ arguments, then the function $h$ (taking $k+1$ arguments) given by the *primitive recursion* of $f$ and $g$ is also primitive recursive, where:
    - $h(0, x_1, … x_k) = f(x_1, … x_k)$, and
    - $h(y+1, x_1, … x_k) = g(y, h(y, x_1, … x_k), x_1, … x_k)$.
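To make that last schema concrete, here's a small Python sketch (the names `primitive_recursion` and `successor` are mine, purely for illustration): it builds $h$ from $f$ and $g$ exactly as above, then uses the schema to define addition from nothing but the successor function.

```python
def primitive_recursion(f, g):
    """Build h from f and g by the schema:
    h(0, *xs) = f(*xs); h(y+1, *xs) = g(y, h(y, *xs), *xs)."""
    def h(y, *xs):
        value = f(*xs)
        for i in range(y):        # a loop that runs exactly y times
            value = g(i, value, *xs)
        return value
    return h

def successor(x):
    return x + 1

# Addition from successor: add(0, x) = x; add(y+1, x) = S(add(y, x)).
add = primitive_recursion(lambda x: x,
                          lambda y, prev, x: successor(prev))

print(add(3, 4))  # 7
```

Note that the loop in `h` runs a number of times known before it starts – that boundedness is the whole point of the schema.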

OK: primitive recursive functions are ones that can be made by:

- adding 1 to a number
- repeating a function a fixed number of times (like a for loop)
- shuffling the arguments, or
- composing such functions with each other.
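In those terms, multiplication built from nothing but bounded for loops and "add 1" might look like this (an illustrative Python sketch, with my own function names):

```python
# Multiplication from bounded loops and "add 1" only:
def add(y, x):
    for _ in range(y):    # repeat "add one" exactly y times
        x = x + 1
    return x

def mul(y, x):
    total = 0
    for _ in range(y):    # repeat "add x" exactly y times
        total = add(x, total)
    return total

print(mul(6, 7))  # 42
```

Every loop bound is fixed before the loop starts, so there's no way for this style of program to run forever.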

And what’s the point? Well, primitive recursive functions are guaranteed to *halt* – they can never get into an infinite loop. (Lift the restriction that every loop must run a predetermined number of times – allowing unbounded searches – and the system becomes *Turing complete*; then you can’t, in general, decide whether an arbitrary program terminates.)

The Ackermann function (or, in this form, the Ackermann–Péter function) is given by:

- $A(0,n) = n+1$
- $A(m,0) = A(m-1, 1)$ if $m$ is positive
- $A(m,n) = A(m-1, A(m, n-1) )$ if $m$ and $n$ are both positive.

That doesn’t seem too outrageous: for example, we can work out $A(1,1)$ without too much trouble:

Using the last definition, $A(1,1) = A(0, A(1,0) )$.

We need to work out $A(1,0)$, using the second definition, which gives $A(1,0) = A(0,1) = 2$ (using the first definition).

So, we have $A(1,1) = A(0, 2) = 3$. That’s not a big number!
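The three rules transcribe directly into code – here's a quick (and deliberately naive) Python version, which confirms the hand computation above:

```python
def A(m, n):
    if m == 0:
        return n + 1
    if n == 0:
        return A(m - 1, 1)
    return A(m - 1, A(m, n - 1))

print(A(1, 1))  # 3, matching the hand computation
print(A(2, 3))  # 9
print(A(3, 3))  # 61
```

Don't push it much further than this, though: even modest arguments blow through Python's default recursion limit.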

However, as $m$ and $n$ increase, the Ackermann function increases dramatically – $A(4,2)$, for example, is $2^{65536} - 3$, a number with 19,729 digits.
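For small $m$ there are closed forms – $A(1,n) = n+2$, $A(2,n) = 2n+3$ and $A(3,n) = 2^{n+3}-3$ – which a quick check against a direct transcription of the rules bears out (a sketch only; beyond row 3 the recursion depth explodes):

```python
import sys
sys.setrecursionlimit(10_000)  # the naive recursion gets deep quickly

def A(m, n):
    if m == 0:
        return n + 1
    if n == 0:
        return A(m - 1, 1)
    return A(m - 1, A(m, n - 1))

# Closed forms for the first few rows:
for n in range(6):
    assert A(1, n) == n + 2
    assert A(2, n) == 2 * n + 3
    assert A(3, n) == 2 ** (n + 3) - 3
print("closed forms check out")
```

Row 4 is where things go wild: $A(4,n)$ is a tower of $n+3$ twos, minus 3 – which is how $A(4,2)$ gets to $2^{65536}-3$.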

It does, however, always terminate – each $A(m,n)$ is either given directly (if $m=0$) or defined in terms of "smaller" Ackermann values: in every recursive call, either $m$ decreases, or $m$ stays the same and $n$ decreases. The argument pairs shrink in lexicographic order, and no such chain can descend forever.

I’m writing this with a raging headache and, I have to concede, a flimsy understanding of the details of the proof. My understanding of it goes as follows:

For every primitive recursive function $f$, there is an integer, $t$, such that $f(x_1,…, x_n) < A(t, \max(\{x_1,…,x_n\}))$.

This means that $A$ cannot be primitive recursive: if it were, then so would be the diagonal $f(x) = A(x,x)$, so there would be a $t$ with $A(x,x) < A(t,x)$ for all $x$ – and setting $x=t$ gives $A(t,t) < A(t,t)$, a contradiction. In short, $A$ grows more quickly than any primitive recursive function possibly can!

The Ackermann function is a lovely example (to me) of complexity developing from very simple rules – but its mathematical importance comes from showing you can have a function that’s guaranteed to halt, but that isn’t primitive recursive.