In a previous article we saw that writing the HALT program is a logical impossibility, but what about more straightforward calculating tasks? Our simple computing model has stood up to the challenge of the hyper-exponential growth generated by the ultra-recursive Ackermann function. But how will it go playing the Busy Beaver game, invented by Tibor Radó in 1962?
The Busy Beaver game
To understand the Busy Beaver game we must return to our idealised model of computation, and add a few more details. Firstly, we have to specify the nature of the input and output of our computing machine. We will use an infinite tape of discrete cells, each of which contains either a 0 or a 1, and each able to be read by a program and overwritten. Secondly, we will model a program as a set of cards, operating on a currently active cell. Each card consists of two rules, one to apply if the active cell contains a 0, and another if it contains a 1. The rules tell the machine to write a new symbol into the cell (it may be the same symbol as is currently there), make either the next left or next right cell the new active cell, and then load a new card (or HALT). The size of the program, or equivalently the size of the machine, is simply the number of cards.
This model might seem too simple, but it turns out that any calculation a more complex machine can perform can also be computed under this simple model. In fact, any computer program you can think of can be implemented on such a machine, which, by the way, is called a Turing machine.
Representing the rules as a triple (new symbol, shift L or R, new card), here is an example two-card machine.
$$\begin{array}{ll}
\text{Card 1:} & 0 \rightarrow 1,R,2 \\
 & 1 \rightarrow 1,L,2 \\
\text{Card 2:} & 0 \rightarrow 1,L,1 \\
 & 1 \rightarrow 1,R,\text{HALT}
\end{array}$$

Starting with card 1 and a tape initialised with 0 in every position, running this machine proceeds as in the following table.
| Card | Symbol | Write | Move | Next card | Tape (new active cell in bold) |
|------|--------|-------|------|-----------|--------------------------------|
| 1 | 0 | 1 | R | 2 | ...0001**0**00... |
| 2 | 0 | 1 | L | 1 | ...000**1**100... |
| 1 | 1 | 1 | L | 2 | ...00**0**1100... |
| 2 | 0 | 1 | L | 1 | ...0**0**11100... |
| 1 | 0 | 1 | R | 2 | ...01**1**1100... |
| 2 | 1 | 1 | R | HALT | ...011**1**100... |
We see that this machine runs through six card transitions (steps), writes four 1s onto the tape, and then halts. A different machine with different instructions written on the two cards might of course run on for longer than six steps. This raises a question: out of all the halting machines running a two-card program and set to start on a tape with all 0s, which one runs the longest?
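The card-based model is simple enough to sketch in a few lines of code. Here is a minimal simulator (the representation and names such as `run` and `CARDS` are my own, not from the article) that replays the two-card example above:

```python
# A minimal sketch of the card-based machine described above, representing
# each card as a dict from the symbol read to (write, move, next card).
from collections import defaultdict

# The two-card example machine; "HALT" as the next card stops the run.
CARDS = {
    1: {0: (1, "R", 2), 1: (1, "L", 2)},
    2: {0: (1, "L", 1), 1: (1, "R", "HALT")},
}

def run(cards, max_steps=10_000):
    """Run a machine from card 1 on an all-0 tape; return (steps, 1s written)."""
    tape = defaultdict(int)   # infinite tape of 0s, indexed by integer position
    pos, card, steps = 0, 1, 0
    while card != "HALT" and steps < max_steps:
        write, move, card = cards[card][tape[pos]]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        steps += 1
    return steps, sum(tape.values())

print(run(CARDS))  # → (6, 4): six steps and four 1s, matching the trace above
```

Using a dictionary for the tape means we never have to decide in advance how far left or right the machine will wander.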
Radó calls this busiest machine a Busy Beaver. You can also define a busy beaver for machines running with three cards, four cards, or any number $n$ of cards. The process of finding the maximum number $BB(n)$ of steps executed by a busy beaver of size $n$ is called the Busy Beaver game. The first few $BB$ numbers are easy to determine. For $0$ cards there can be no steps, hence $BB(0) = 0$. For $1$ card, all that can be done is an immediate transition to the HALT state, hence $BB(1) = 1$. It is far from obvious, but there is no two-card machine busier than our example above, and so $BB(2) = 6$. So far these numbers do not look particularly exciting. It took some years to prove, but the next two values are $BB(3) = 21$ and $BB(4) = 107$. But, to this day, no other values of the $BB(n)$ function are known. The busiest five-card machine found so far is shown below, with the cards labelled A to E (H denotes HALT) and each entry giving a rule in compact form: the symbol to write, the direction to move, and the next card. It runs for 47,176,870 steps before halting.

| | A | B | C | D | E |
|---|---|---|---|---|---|
| 0 | 1RB | 1RC | 1RD | 1LA | 1RH |
| 1 | 1LC | 1RB | 0LE | 1LD | 0LA |

The busiest six-card machine found so far is:

| | A | B | C | D | E | F |
|---|---|---|---|---|---|---|
| 0 | 1RB | 1RC | 1LD | 1RE | 1LA | 1LH |
| 1 | 1LE | 1RF | 0RB | 0LC | 0RD | 1RC |
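The claim that no two-card machine is busier than our earlier example can even be checked by brute force. The sketch below is my own code, not from the article; the step cap of 50 is an assumption, safe here because every halting two-card machine in fact stops within 6 steps:

```python
# Brute-force the two-card Busy Beaver: enumerate every two-card machine,
# run each from an all-0 tape with a step cap, and record the longest halter.
from itertools import product
from collections import defaultdict

SYMBOLS = (0, 1)
MOVES = ("L", "R")
NEXTS = (1, 2, "HALT")
RULES = list(product(SYMBOLS, MOVES, NEXTS))  # 12 possible rules per symbol

def steps_to_halt(cards, cap=50):
    """Return the number of steps before HALT, or None if the cap is reached."""
    tape = defaultdict(int)
    pos, card = 0, 1
    for step in range(1, cap + 1):
        write, move, card = cards[card][tape[pos]]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        if card == "HALT":
            return step
    return None

best = 0
for r10, r11, r20, r21 in product(RULES, repeat=4):  # 12^4 = 20736 machines
    cards = {1: {0: r10, 1: r11}, 2: {0: r20, 1: r21}}
    s = steps_to_halt(cards)
    if s is not None:
        best = max(best, s)
print(best)  # → 6
```

Of course, this only works because the cap happens to exceed $BB(2)$; for larger machines no such shortcut is available, which is precisely the point of what follows.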
It is clear that the Busy Beaver function grows very rapidly. Indeed, the Busy Beaver candidate machines are similar to the extremely rapidly growing functions we considered in part II of this article—they implement the calculation of some super-exponential function $f(n)$, needing only a few states for small $n$. Writing Graham's number using Knuth's up-arrow notation or Ackermann's function (see part II of this article) requires many lines. Yet it is believed that $BB(23)$ is greater than Graham's number, so the Busy Beaver function appears to grow much faster than Ackermann's function. Can any computer calculate the values $BB(n)$ for all $n$?
The crucial insight comes from assuming the answer is yes, and supposing we have the function $BB(n)$. We could then implement the HALT function as: $$\textbf{HALT}(\text{program},\text{input}) = \begin{cases} \text{NO}, & \text{if the program (with size } n \text{) runs longer than } BB(n), \\ \text{YES}, & \text{otherwise.} \end{cases}$$ That is, if we can calculate $BB(n)$, we can solve the halting problem by converting the input program to a machine of the required type and determining its size $n$, calculating $BB(n)$ and running the machine. If it runs more than $BB(n)$ steps then, by definition, it must run forever. But we've already determined that it is logically impossible to solve the halting problem, hence it is similarly logically impossible to write a program that calculates $BB(n)$ for all $n$. The Busy Beaver function $BB(n)$, despite being well-defined and indeed quite simple to describe, is intrinsically non-computable. In fact, it grows faster than any computable function, since otherwise we could use the faster-growing, computable function to calculate an upper bound on $BB(n)$, and that bound would serve equally well in the implementation of HALT above. It's not just a matter of running time and storage space: computing $BB(n)$ defies the very notion of computability.

The Busy Beaver numbers are interesting and important tools for studying the nature and limits of computation, but a further use of more general appeal is wonderfully described by Scott Aaronson in *Who can name the bigger number?* If you want to win at that game, the Busy Beaver numbers are hard to go past.
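To make the argument concrete with the one value that is cheap to use, here is a sketch (my own code, not from the article) of the halting decider restricted to two-card machines, where $BB(2) = 6$ is known. For this restricted class the decider is exact:

```python
# Decide halting for two-card machines using the known value BB(2) = 6:
# any two-card machine still running after 6 steps must run forever.
from collections import defaultdict

BB2 = 6  # the Busy Beaver number for two-card machines

def halts(cards):
    """Decide halting for a two-card machine started on an all-0 tape."""
    tape = defaultdict(int)
    pos, card = 0, 1
    for _ in range(BB2):
        write, move, card = cards[card][tape[pos]]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        if card == "HALT":
            return True
    return False  # survived BB(2) steps, so by definition it never halts

# The example machine from earlier halts; a trivial two-card ping-pong does not.
example = {1: {0: (1, "R", 2), 1: (1, "L", 2)},
           2: {0: (1, "L", 1), 1: (1, "R", "HALT")}}
looper = {1: {0: (0, "R", 2), 1: (0, "R", 2)},
          2: {0: (0, "L", 1), 1: (0, "L", 1)}}
print(halts(example), halts(looper))  # → True False
```

The catch, as the argument above shows, is that no program can supply the values $BB(n)$ this decider would need for machines of every size.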
About the author
Ken Wessen has PhDs in Theoretical Physics and Human Biology, and a Graduate Diploma in Secondary Mathematics Education. He has taught both at university and in high schools, but most of his working life has been in investment banking, developing algorithms for automated trading of equities and foreign exchange. In his spare time he writes free software and develops online activities for mathematics learning at The Mathenaeum.