In mathematics and computer science, recursion is a common term. Recursion is a process that contains itself as a part, or that can be defined in terms of itself, and is in principle unbounded. Recursive processes can usually be described quite briefly, or triggered by a relatively short statement. In a recursion, the successive sub-processes, or the objects produced one after another, are not independent of each other; rather, each pair of consecutive steps or objects is connected by a special relationship: the recursive relationship.
The term is very broad. Recursion is a frequently observable process in nature (e.g. plant growth), and it is also a problem-solving strategy: complex problems can often be captured very elegantly by recursively formulated rules. The basic principle is to reduce a general task to a simpler task of the same class. This is also used in so-called recursive programming: for recursion to occur, a procedure, function or method merely has to call itself. The process continues until a termination condition included in the program takes effect.
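As a concrete illustration (a Python sketch, not part of the original text), the factorial function reduces the task n! to the simpler task (n-1)! of the same class, with a termination condition at n ≤ 1:

```python
def factorial(n: int) -> int:
    """Compute n! by reducing it to the simpler task (n-1)! of the same class."""
    if n <= 1:                       # termination condition: stops the recursion
        return 1
    return n * factorial(n - 1)      # the function calls itself with a smaller argument

print(factorial(5))  # 120
```

Each call handles only one multiplication and delegates the rest of the work to a simpler instance of the same problem.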
Types of Recursion
- The most common form of recursion is linear recursion, in which no more than one recursive call may occur in each case of the recursive definition. The computation then runs along a chain of calls. In such a recursion, the call tree does not contain any branches.
- Primitive recursion is a special case of linear recursion that can always be replaced by an iteration. Here, functions are defined on the natural numbers, and each recursive call decreases or increases its first parameter by one. Every primitive recursive definition can be replaced by a simple loop (e.g. a for or while loop), with no stack required.
- Tail recursion is the special case of linear recursion in which each recursive call is the last action of the function call. Tail recursions can be replaced by while loops, and vice versa.
- Nested recursion is a recursion in which recursive calls occur in parameter expressions of recursive calls. This form of recursion is considered to be extraordinarily difficult to understand.
- Cascading recursion refers to the case in which several recursive calls stand side by side. The recursive calls then form a tree. Cascading recursion is considered elegant, but without further measures it can entail exponential computational effort. It is often used as a starting point for deriving another, more efficient formulation.
- Mutual recursion (reciprocal recursion) refers to the definition of several functions in terms of each other. It can be reduced to ordinary recursion of a tuple-valued function.
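The forms listed above can be sketched in Python; the function names and examples are illustrative choices, not taken from the text:

```python
# Linear recursion: at most one recursive call in each case of the definition.
def length(xs):
    if not xs:
        return 0
    return 1 + length(xs[1:])

# Tail recursion: the recursive call is the last action (accumulator style),
# so it could be replaced mechanically by a while loop.
def factorial_tail(n, acc=1):
    if n <= 1:
        return acc
    return factorial_tail(n - 1, acc * n)

# Cascading recursion: several calls side by side form a call tree;
# without further measures this costs exponential time.
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Mutual recursion: two functions defined in terms of each other.
def is_even(n):
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)
```

The naive `fib` illustrates why cascading recursion is often only a starting point: the same subproblems appear in many branches of the call tree.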
Recursion in Programming
Higher-level programming languages that work with functions usually also allow recursion. In most cases, solutions can be specified either recursively or iteratively; the two approaches are essentially equally powerful. In both, the same or similar steps are repeated several times; the difference lies in the algorithm used.
In an iteration, a block of statements is repeated in a loop (for, while, …) until a termination condition is met. In a recursion, it suffices for a procedure or function to call itself with regularly changed parameters until a termination condition is met.
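The two styles can be compared on the same computation; this triangular-number example (summing 1 to n) is an illustrative choice, not from the original:

```python
def triangular_iterative(n):
    total = 0
    for i in range(1, n + 1):    # the loop body repeats until the counter passes n
        total += i
    return total

def triangular_recursive(n):
    if n == 0:                   # termination condition
        return 0
    return n + triangular_recursive(n - 1)  # re-applied with a changed parameter
```

Both compute the same result; the iterative version manages an explicit counter and accumulator, while the recursive version leaves that bookkeeping to the chain of function calls.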
A recursion usually requires less source code and is (for experienced readers) clearer: no helper variables or loop counters need to be defined. Iterative processes are usually more efficient and require less memory, because each pending recursive call, together with its local state, must be kept on the call stack. In particular, deep recursion can cause a stack overflow. When programming real-time systems on microcontrollers, recursion is therefore often avoided.
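The stack-overflow risk can be demonstrated in Python, where the interpreter enforces a configurable recursion limit and raises a `RecursionError` instead of crashing (an illustrative sketch):

```python
import sys

def countdown(n):
    # Each pending call occupies a stack frame until the base case is reached.
    if n == 0:
        return "done"
    return countdown(n - 1)

print(countdown(100))            # fine: the stack easily holds 100 frames

overflowed = False
try:
    sys.setrecursionlimit(100)   # shrink the limit to provoke the error
    countdown(10_000)            # far too deep for the shrunken stack
except RecursionError:
    overflowed = True
finally:
    sys.setrecursionlimit(3000)  # restore a sane limit

print(overflowed)  # True
```

On a microcontroller without such a guard, the same situation would silently corrupt memory or reset the device, which is why recursion is often avoided there.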
Some programming languages (e.g. purely functional ones) do not offer iteration constructs, so the recursive implementation must always be chosen. Such languages often optimize tail recursions, which are implemented internally as iterations (some interpreters for Lisp and Scheme do this).
It should be noted that a naïve implementation of some functions (for example, the Fibonacci numbers) computes partial solutions multiple times. The remedy here is memoization, i.e. the reuse of intermediate solutions that have already been calculated. Recursion is an essential part of some design strategies for efficient algorithms, especially the divide-and-conquer strategy; other approaches (e.g. so-called greedy algorithms) take an iterative approach. Recursion and primitive recursive functions play a major role in theoretical computer science, especially in complexity theory and computability theory (see also lambda calculus and the Ackermann function). In compiler construction, recursive descent is a technique in which a language is parsed recursively.
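Memoization can be sketched in Python with the standard library's `functools.lru_cache`; the naive Fibonacci function is a classic case where the same partial solutions are recomputed many times:

```python
from functools import lru_cache

@lru_cache(maxsize=None)           # cache: each fib(k) is computed only once
def fib(n: int) -> int:
    if n < 2:                      # termination condition
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(90))                     # returns instantly; without the cache this
                                   # call would branch exponentially
```

The recursive structure of the definition is unchanged; only the repeated work is eliminated, turning exponential running time into linear time in `n`.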