
Introduction to Recursion

The process in which a function calls itself directly or indirectly is called recursion, and the corresponding function is called a recursive function.
• A recursive algorithm takes one step toward the solution and then calls itself recursively to make further progress. The algorithm stops once the solution is reached.
• Since the called function may call itself again, this process could continue forever. It is therefore essential to provide a base case that terminates the recursion, as in the sketch below.
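As a minimal illustrative sketch (written in Python, with a function name of our own choosing), the following shows how the base case stops the chain of recursive calls:

def countdown(n):
    # Base case: stop recursing once we reach 0.
    if n == 0:
        print("Done")
        return
    # Recursive case: take one step toward the solution, then recurse.
    print(n)
    countdown(n - 1)

countdown(3)  # prints 3, 2, 1, Done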
What is Recursion?
Recursion is the process in which a function calls itself again and again. It involves breaking a difficult problem down into smaller, more manageable subproblems and solving each of them in the same way. There must be a terminating condition to stop the recursive calls. Recursion can be seen as an alternative to iteration, and it provides an elegant way to solve many complex problems.
Recursive Function
A recursive function is a function that calls itself one or more times within its body. A recursive function solves a particular problem by calling a copy of itself to solve smaller subproblems of the original problem. Further recursive calls can be generated as and when required. It is necessary to have a terminating condition, or base case, in recursion; otherwise the calls may go on endlessly, leading to infinite recursion and a call stack overflow.
Recursive function calls are managed on the call stack, which follows a LIFO (Last In, First Out) order, just like the stack data structure. A recursion tree is a diagram of the function calls, connected by arrows, that depicts the order in which the calls were made.
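A small hypothetical Python sketch makes this call-stack behaviour visible: the "enter" lines are printed as frames are pushed onto the stack, and the "leave" lines are printed as the frames are popped in reverse (LIFO) order.

def trace(n):
    print("enter", n)      # printed on the way down, as frames are pushed
    if n > 0:
        trace(n - 1)       # this frame waits here until the inner call returns
    print("leave", n)      # printed on the way back up, as frames are popped

trace(2)
# Output: enter 2, enter 1, enter 0, leave 0, leave 1, leave 2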
Properties of Recursion
1. It solves a problem by breaking it down into smaller sub-problems, each of which can be
solved in the same way.
2. A recursive function must have a base case or stopping criteria to avoid infinite recursion.
3. Recursion involves calling the same function within itself, which leads to a call stack.
4. Recursive functions may be less efficient than iterative solutions in terms of memory and
performance.
Applications of Recursion
1. Recursive solutions are best suited to some problems like:
1. Tree Traversals: InOrder, PreOrder, PostOrder
2. Graph Traversals: DFS [Depth First Search] and BFS [Breadth First Search]
3. Tower of Hanoi
4. Backtracking Algorithms
5. Divide and Conquer Algorithms
6. Dynamic Programming Problems
7. Merge Sort, Quick Sort
8. Binary Search
9. Fibonacci Series, Factorial, etc.
2. Recursion is used in the design of compilers to parse and analyze programming languages.
3. Many computer graphics algorithms, such as fractals and the Mandelbrot set, use recursion
to generate complex patterns.
Advantages of Recursion
1. Clarity and simplicity: Recursion can make code more readable and easier to understand.
Recursive functions can be easier to read than iterative functions when solving certain types
of problems, such as those that involve tree or graph structures.
2. Reducing code duplication: Recursive functions can help reduce code duplication by
allowing a function to be defined once and called multiple times with different parameters.
3. Solving complex problems: Recursion can be a powerful technique for solving complex
problems, particularly those that involve dividing a problem into smaller subproblems.
4. Flexibility: Recursive functions can be more flexible than iterative functions because they can
handle inputs of varying sizes without needing to know the exact number of iterations
required.
Disadvantages of Recursion
1. Performance Overhead: Recursive algorithms may have a higher performance overhead
compared to iterative solutions. This is because each recursive call creates a new stack frame,
which takes up additional memory and CPU resources. Recursion may also cause stack
overflow errors if the recursion depth becomes too deep.
2. Difficult to Understand and Debug: Recursive algorithms can be difficult to understand and
debug because they rely on multiple function calls, which can make the code more complex
and harder to follow.
3. Memory Consumption: Recursive algorithms may consume a large amount of memory if the
recursion depth is very deep.
4. Limited Scalability: Recursive algorithms may not scale well for very large input sizes
because the recursion depth can become too deep and lead to performance and memory
issues.
5. Tail Recursion Optimization: Some programming languages do not support tail-call optimization, which means that tail-recursive functions can still be slow and may cause stack overflow errors (see the sketch below).
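For illustration only, the following Python sketch (function names are our own) contrasts a plain recursive sum with a tail-recursive version that carries an accumulator. Note that CPython does not perform tail-call optimization, so in Python both versions still grow the call stack with n.

# Plain recursion: the addition happens after the recursive call returns,
# so every frame must remain on the call stack until the bottom is reached.
def sum_to(n):
    if n == 0:
        return 0
    return n + sum_to(n - 1)

# Tail-recursive form: the recursive call is the last action, and the running
# total is carried in an accumulator. Languages that support tail-call
# optimization can reuse a single stack frame for this version.
def sum_to_tail(n, acc=0):
    if n == 0:
        return acc
    return sum_to_tail(n - 1, acc + n)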
What is a Recursive Algorithm?
A recursive algorithm is an algorithm that uses recursion to solve a problem. Recursive algorithms
typically have two parts:
1. Base case: Which is a condition that stops the recursion.
2. Recursive case: Which is a call to the function itself with a smaller version of the problem.
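As a small example of our own (not from the original text), the two parts can be labelled directly in a Python sketch that counts the digits of a non-negative integer:

def count_digits(n):
    # Base case: a single-digit number has exactly one digit.
    if n < 10:
        return 1
    # Recursive case: strip the last digit and solve the smaller problem.
    return 1 + count_digits(n // 10)

print(count_digits(12345))  # 5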
Types of Recursion
There are several different recursion types and terms. These include:
• Direct recursion: Typified by the factorial implementation, where a method calls itself.
• Indirect recursion: One method, say method A, calls another method B, which then calls method A again. This involves two or more methods that together form a circular call sequence.
• Head recursion: The recursive call is made at the beginning of the method, before any other processing.
• Tail recursion: The recursive call is the last statement executed in the method.
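The following Python sketch, using example functions of our own choosing, illustrates the four types:

# Direct recursion: the function calls itself.
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

# Indirect recursion: is_even calls is_odd, which calls is_even again.
def is_even(n):
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)

# Head recursion: the recursive call comes first, the work happens afterwards.
def print_ascending(n):
    if n == 0:
        return
    print_ascending(n - 1)
    print(n)

# Tail recursion: the recursive call is the last statement executed.
def print_descending(n):
    if n == 0:
        return
    print(n)
    print_descending(n - 1)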
When to Use Recursion?
Recursion is a powerful technique that can be used to solve a wide variety of problems. However, it is
important to use recursion carefully, as it can lead to stack overflows if not used properly.
Recursion should be used when:
• The problem can be broken down into smaller subproblems that can be solved recursively.
• The base case is easy to identify.
• The recursive calls are tail recursive.
Examples of Recursion
Here are some common examples of recursion:
Example 1: Factorial: The factorial of a number n is the product of all the integers from 1 to n. The
factorial of n can be defined recursively as:
factorial(n) = n * factorial(n-1), with the base case factorial(0) = 1
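A minimal Python version of this definition might look like the following (assuming factorial(0) = 1 as the base case):

def factorial(n):
    # Base case: 0! = 1! = 1
    if n <= 1:
        return 1
    # Recursive case: n! = n * (n - 1)!
    return n * factorial(n - 1)

print(factorial(5))  # 120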
Example 2: Fibonacci sequence: The Fibonacci sequence is a sequence of numbers where each
number is the sum of the two preceding numbers. The Fibonacci sequence can be defined recursively
as:
fib(n) = fib(n-1) + fib(n-2), with the base cases fib(0) = 0 and fib(1) = 1
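A direct, if inefficient, Python translation of this definition (with fib(0) = 0 and fib(1) = 1 assumed as base cases) is:

def fib(n):
    # Base cases: fib(0) = 0, fib(1) = 1
    if n < 2:
        return n
    # Recursive case: sum of the two preceding Fibonacci numbers
    return fib(n - 1) + fib(n - 2)

print([fib(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]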
Applications of Recursion Algorithms:
Here are some common applications of recursion:
• Tree and Graph Traversal: Depth-first search (DFS) and breadth-first search (BFS)
• Dynamic Programming: Solving optimization problems by breaking them into smaller
subproblems
• Divide-and-Conquer: Solving problems by dividing them into smaller parts, solving each part
recursively, and combining the results
• Backtracking: Exploring all possible solutions to a problem by recursively trying different
options
• Combinatorics: Counting or generating all possible combinations or permutations of a set.
Steps to Implement Recursion
• Step 1 - Define a base case: Identify the simplest (or base) case for which the solution is known or trivial. This is the stopping condition for the recursion, as it prevents the function from infinitely calling itself.
• Step 2 - Define a recursive case: Define the problem in terms of smaller subproblems. Break the problem down into smaller versions of itself, and call the function recursively to solve each subproblem.
• Step 3 - Ensure the recursion terminates: Make sure that the recursive function eventually reaches the base case, and does not enter an infinite loop.
• Step 4 - Combine the solutions: Combine the solutions of the subproblems to solve the original problem.
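As a worked illustration of these steps (an example of our own: summing a list of numbers in Python), consider:

def list_sum(values):
    # Step 1 - base case: the sum of an empty list is 0.
    if not values:
        return 0
    # Step 2 - recursive case: reduce the problem to a smaller list.
    # Step 3 - each call removes one element, so the base case is always reached.
    rest = list_sum(values[1:])
    # Step 4 - combine: add the first element to the solution of the subproblem.
    return values[0] + rest

print(list_sum([1, 2, 3, 4]))  # 10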
Why do we need Recursion?
• Recursion helps in logic building. Recursive thinking helps in solving complex problems by
breaking them into smaller subproblems.
• Recursive solutions work as a basis for Dynamic Programming and Divide and Conquer algorithms.
• Certain problems can be solved quite easily using recursion like Towers of Hanoi
(TOH), Inorder/Preorder/Postorder Tree Traversals, DFS of Graph, etc.
What is the base condition in recursion?
A recursive program stops at a base condition, and there can be more than one base condition in a recursion. In the factorial example above, the base condition is reached when n is 0 or 1.
How is a particular problem solved using recursion?
The idea is to represent a problem in terms of one or more smaller problems, and add one or more
base conditions that stop the recursion.
Common Applications of Recursion
1. Tree and Graph Traversal: Used for systematically exploring nodes/vertices in data
structures like trees and graphs.
2. Sorting Algorithms: Algorithms like quicksort and merge sort divide data into subarrays, sort
them recursively, and merge them.
3. Divide-and-Conquer Algorithms: Algorithms like binary search break problems into smaller
subproblems using recursion.
4. Fractal Generation: Recursion helps generate fractal patterns, such as the Mandelbrot set, by
repeatedly applying a recursive formula.
5. Backtracking Algorithms: Used for problems requiring a sequence of decisions, where
recursion explores all possible paths and backtracks when needed.
6. Memoization: Involves caching results of recursive function calls to avoid recomputing
expensive subproblems.
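A small sketch of memoization (our own example, using an explicit cache dictionary for the Fibonacci numbers) could look like:

def fib_memo(n, cache=None):
    if cache is None:
        cache = {}
    if n in cache:                 # reuse a cached subproblem result
        return cache[n]
    if n < 2:                      # base cases
        result = n
    else:                          # each subproblem is computed only once
        result = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    cache[n] = result
    return result

print(fib_memo(40))  # fast, because repeated subproblems are cached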
Advantages of Recursive Algorithms
• Simplifies Complex Problems: Recursion can make solving complex problems easier by
breaking them down into smaller, manageable subproblems.
• Reduces Code Size: Recursive solutions often require fewer lines of code compared to
iterative solutions, making the code more concise and easier to read.
• Natural Fit for Certain Problems: Problems like tree traversals, factorial calculation, and
the Tower of Hanoi are naturally suited to recursion, leading to more intuitive solutions.
• Cleaner Code: Recursive algorithms can lead to cleaner, more elegant code, especially for
problems that involve repeated operations.
Disadvantages of Recursive Algorithms
• Higher Memory Usage: Recursion can use more memory due to the overhead of maintaining
multiple function calls on the call stack, which can lead to stack overflow if the recursion is
too deep.
• Slower Performance: Recursive algorithms may be slower than their iterative counterparts
due to the overhead of function calls and stack management.
• Risk of Infinite Recursion: If the base case is not defined or not reached, recursion can lead
to infinite loops and program crashes.
• Harder to Debug: Recursive algorithms can be more difficult to debug, as tracking the
sequence of function calls can be challenging, especially in complex recursion.
Applications of Recursive Algorithms
1. Sorting Algorithms
• Merge Sort: Uses recursion to divide the array into smaller subarrays and then merge them in
a sorted manner.
• Quick Sort: Recursively partitions the array around a pivot and sorts the partitions.
2. Search Algorithms
• Binary Search: Recursively divides a sorted array in half to locate a target element (a short sketch follows this list).
3. Mathematical Computations
• Factorial Calculation: Computes the factorial of a number by recursively multiplying it by
the factorial of the previous number.
• Fibonacci Sequence: Generates Fibonacci numbers using recursion by summing the previous
two numbers in the sequence.
4. Tree and Graph Traversals
• Tree Traversals (Preorder, Inorder, Postorder): Recursively visits nodes in a tree data
structure.
• Depth-First Search (DFS): Recursively explores all possible paths in a graph
before backtracking.
5. Dynamic Programming
• Memoization: Recursive algorithms with memoization store the results of subproblems to
avoid redundant calculations, such as in the computation of Fibonacci numbers or the
Knapsack problem.
6. Divide and Conquer Algorithms
• Strassen’s Matrix Multiplication: Recursively multiplies matrices by breaking them down
into smaller submatrices.
• Karatsuba Algorithm: Recursively multiplies large numbers by dividing them into smaller
parts.
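As an illustrative sketch of the recursive binary search mentioned above (the function name and signature are our own):

def binary_search(arr, target, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                    # base case: empty range, target not present
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:         # found the target at index mid
        return mid
    if arr[mid] < target:          # recurse on the right half
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)   # recurse on the left half

print(binary_search([2, 5, 8, 12, 21], 12))  # 3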
Queue Data Structure
A Queue Data Structure is a fundamental concept in computer science used for storing and
managing data in a specific order.
• It follows the principle of "First in, First out" (FIFO), where the first element added to the
queue is the first one to be removed.
• It is used as a buffer in computer systems where there is a speed mismatch between two communicating devices, for example between the CPU and the keyboard, or between two devices in a network.
• Queues are also used in Operating System algorithms like CPU Scheduling and Memory Management, and in many standard algorithms like Breadth First Search of a graph and Level Order Traversal of a tree.
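A minimal usage sketch, assuming Python's collections.deque as the underlying container, shows the FIFO behaviour described above:

from collections import deque

queue = deque()
queue.append("job1")       # enqueue at the rear
queue.append("job2")
queue.append("job3")
print(queue.popleft())     # dequeue from the front -> job1 (first in, first out)
print(queue.popleft())     # -> job2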
Real-World Applications of Queues
1. Task Scheduling: Operating systems use queues to manage processes waiting for CPU time.
2. Print Spooling: Print jobs are queued and processed in the order they are received.
3. Web Servers: Requests are handled in a queue to ensure fairness and order.
4. Breadth-First Search (BFS): Queues are used in graph traversal algorithms like BFS.
5. Call Centers: Incoming calls are placed in a queue and answered in the order they are
received.

Issue of Queue:
