recursion

Table of Contents

1. Recursion is Recursion

Exactly as I say in the title.

1.1. but what is recursion?

Recursion.

1.2. No, seriously, what is it?

Self reference.

1.3. haha, very clever, it's not like that joke has been made 10 million times

Yeah, but I think it's a good introduction to the subject. You can think of recursion as self-assembly, and it has deep connections to topics such as emergence. I will first describe it in a mathematical context, and then in a programming context. For demonstration purposes, I will use my own programming language, Stem (warning: link takes you outside of mindmap). Again, stem is a prerequisite, as it is the standard programming language in the mindmap.

2. Mathematics Describes Recursion

For this example, I will be using the factorial. One might define it like so:

\begin{align*} f: \mathbb{N}\rightarrow\mathbb{N} \text{ s.t. } \\ f(0) = 1 \\ f(n) = nf(n - 1), \quad n \geq 1 \end{align*}

in other words, we want a function defined over the natural numbers that is one when the input is zero, and otherwise multiplies the input by the value of the function at one less than the input. Let's try evaluating this function at \(n = 3\).

\begin{align*} f(3) = 3f(3 - 1) = 3f(2) \\ f(2) = 2f(1) \\ f(1) = 1f(0) \\ f(0) = 1 \end{align*}

once we substitute \(f(0) = 1\) in, you will see it all collapses.

\begin{align*} f(0) = 1 \\ f(1) = 1f(0) = 1 \times 1 = 1 \\ f(2) = 2f(1) = 2 \times 1 = 2 \\ f(3) = 3f(2) = 3 \times 2 = 6 \end{align*}

and so the result is multiplying \(3 \times 2 \times 1 \times 1 = 6\). If you observe what we did, you'll see that we started by replacing each unknown value with an evaluation of \(f\) one number down, until we eventually reached a "base case" – zero. As soon as the base case occurs, we "go back up" by replacing all the unknown values with known ones – and that's how we evaluate recursive functions.
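
In general, the same collapse unrolls the definition into the familiar product form for \(n \geq 1\):

\begin{align*} f(n) = n \times (n - 1) \times \cdots \times 2 \times 1 \end{align*}

which is exactly the factorial \(n!\), with \(f(0) = 1\) covering the empty product.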

3. Programming Describes Recursion

In stem, a factorial implementation might look like this:

factorial [ dup 0 <= [ 1 + ] [ dup 1 - factorial * ] if ] def
5 factorial .

and in stem, we can print out the stack at every step of the way with the builtin word ?:

factorial-debug [ dup 0 <= [ 1 + ] [ ? "\n" . dup 1 - factorial-debug ? "\n" . * ] if ] def
5 factorial-debug .

as you can see, the stack is slowly built up until it holds all of the numbers we need, and then we reach the base case (the base case being the branch of the if statement that doesn't cause recursion). At that point we "go back up" by multiplying our way back down the stack. This procedure of using a stack is present in all programming languages, although in stem the operations are transparent, since the stack is directly accessible to regular program users. In short, we keep going down and down until we hit the bottom – the base case – at which point we have all the pieces we need to go back up again: the stack stores the most recently deferred tasks on top, and we work our way back up through the less recent ones.
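
To make this concrete, here is a rough sketch of the successive stacks that 5 factorial-debug should print (the exact formatting of ?'s output may differ; the stack is written here from bottom to top):

5
5 4
5 4 3
5 4 3 2
5 4 3 2 1
5 4 3 2 1 1
5 4 3 2 1
5 4 3 2
5 4 6
5 24

the first five lines are the "going down" phase; the jump from 5 4 3 2 1 to 5 4 3 2 1 1 is the base case turning the pushed 0 into a 1; the remaining lines are the multiplications on the way back up, after which . prints the final result, 120.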

This concept is important in programming because it allows one to build definitions in an intuitive way, simply by specifying the base case and the case that is not the base case. Such an approach absolves one of having to design complicated patterns; instead, the entire computation emerges out of simple rules.
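
As a sketch of that pattern (using only words that already appear above and assuming they behave as in the factorial example), a word that sums the numbers from 1 to n can be defined the same way, with one quote for the base case and one for the recursive case:

sum [ dup 0 <= [ 0 + ] [ dup 1 - sum + ] if ] def
5 sum .

the base case quote 0 + plays the same role as 1 + in factorial: the 0 left on the stack stays 0, the sum of nothing, and 5 sum . should then print 15.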

In general, we see recursive definitions and patterns in nature in the form of fractals – structures built out of smaller copies of themselves.

4. Self Reference Problems

A big part of infinite recursion has to do with self reference problems. For instance, Russell's paradox with respect to set theory: does a set that contains all sets that do not contain themselves contain itself?

Such a set would contain itself if and only if it didn't contain itself. This apparent contradiction in set theory is an example of using recursion to reach self reference paradoxes. There are more examples, such as Gödel's incompleteness theorems and Turing's proof that the halting problem is undecidable.
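
The programming analogue is a definition that refers to itself without ever reaching a base case. As a hypothetical sketch in stem (a made-up word name, assuming def behaves as above), such a word never finishes evaluating – it just keeps expanding into itself:

loop [ loop ] def
loop

this is recursion with the "go back up" step removed: with no base case there is nothing to collapse, only endless self reference.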

Copyright © 2024 Preston Pan