Python Programming

Lecture 14: Recursion and Sorting Algorithms

14.1 Recursion Examples

  • Example 1: The Greatest Common Divisor
  • We have already learned an iterative method to compute it (Euclid's algorithm).

a = int(input('Enter your first number:'))
b = int(input('Enter your second number:'))
if a >= b:
    x = a
    y = b
else:
    x = b
    y = a
while y != 0:
    r = y
    y = x % y
    x = r
print('The greatest common divisor is', x)
  • Let's do it with recursion.

def gcd(a, b):
    if b == 0:
        return a
    elif a < b:
        return gcd(b, a)
    else:
        return gcd(a - b, b)
  • We can make it simpler.

def gcd(a, b):
    if b == 0:
        return a
    return gcd(b, a % b)
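As a quick check (this comparison is an addition, not part of the lecture), the one-line recursive version agrees with the standard library's `math.gcd`:

```python
import math

# Recursive Euclidean algorithm, as above.
def gcd(a, b):
    if b == 0:
        return a
    return gcd(b, a % b)

# Cross-check against the standard library on a few sample pairs.
for a, b in [(48, 36), (17, 5), (100, 10)]:
    print(a, b, gcd(a, b), math.gcd(a, b))
```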
  • Example 2: Fibonacci Numbers
  • The Fibonacci Sequence is the series of numbers:
  • 1, 1, 2, 3, 5, 8, 13, 21, 34, ...

def fib(n):
    if n == 0 or n == 1:
        return 1
    return fib(n - 1) + fib(n - 2)
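The naive recursion recomputes the same subproblems over and over, so its running time grows exponentially. A memoized sketch (the name `fib_memo` and the use of `functools.lru_cache` are additions here, not from the lecture) caches each result and makes the computation linear:

```python
from functools import lru_cache

# Memoized variant (a sketch): each fib_memo(n) is computed only once.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n == 0 or n == 1:
        return 1
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(30))  # 1346269
```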
  • Example 3: Full Permutation
  • There is a simple way...

import itertools
for j in itertools.permutations([2, 5, 6]):
    print(j)
  • Let's do it with recursion.

def recursion_permutation(a_list, first, last):
    if first >= last:
        print(a_list)
        return
    for i in range(first, last):
        a_list[i], a_list[first] = a_list[first], a_list[i]
        recursion_permutation(a_list, first + 1, last)
        a_list[i], a_list[first] = a_list[first], a_list[i]

  • Example 4: Hanoi
  • The Tower of Hanoi is a mathematical game or puzzle. It consists of three rods and a number of disks of different sizes, which can slide onto any rod. The puzzle starts with the disks in a neat stack in ascending order of size on one rod, the smallest at the top, thus making a conical shape.
  • The objective of the puzzle is to move the entire stack to another rod, obeying the following simple rules:
    • Only one disk can be moved at a time.
    • Each move consists of taking the upper disk from one of the stacks and placing it on top of another stack or on an empty rod.
    • No larger disk may be placed on top of a smaller disk.

def moveTower(height, fromPole, toPole, withPole):
    if height >= 1:
        moveTower(height - 1, fromPole, withPole, toPole)
        moveDisk(fromPole, toPole)
        moveTower(height - 1, withPole, toPole, fromPole)

def moveDisk(fp, tp):
    print("moving disk from %s to %s" % (fp, tp))
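This recursion performs $2^{height} - 1$ disk moves in total. A minimal sketch (the helper `count_moves` is introduced here for illustration) counts the moves instead of printing them:

```python
# Count the moves made by the Tower of Hanoi recursion:
# move height-1 disks aside, move the largest disk, move them back on top.
def count_moves(height):
    if height < 1:
        return 0
    return count_moves(height - 1) + 1 + count_moves(height - 1)

for h in range(1, 5):
    print(h, count_moves(h))  # 2**h - 1 moves for h disks
```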

14.2 Sorting Algorithms

  • Sorting is the process of placing elements from a collection in some kind of order. There are many, many sorting algorithms that have been developed and analyzed. This suggests that sorting is an important area of study in computer science.
    • Bubble Sort
    • Selection Sort
    • Insertion Sort
    • Merge Sort
    • Quick Sort
1. Bubble sort
  • The bubble sort makes multiple passes through a list. It compares adjacent items and exchanges those that are out of order. Each pass through the list places the next largest value in its proper place. In essence, each item “bubbles” up to the location where it belongs.
  • If there are $n$ items in the list, then there are $n$ − 1 pairs of items that need to be compared on the first pass. It is important to note that once the largest value in the list is part of a pair, it will continually be moved along until the pass is complete.
  • At the start of the second pass, the largest value is now in place. There are $n$ − 1 items left to sort, meaning that there will be $n$ − 2 pairs. Since each pass places the next largest value in place, the total number of passes necessary will be $n$ − 1. After completing the $n$ − 1 passes, the smallest item must be in the correct position with no further processing required.

def bubble_sort(a_list):
    for pass_num in range(len(a_list) - 1, 0, -1):
        for i in range(pass_num):
            if a_list[i] > a_list[i + 1]:
                temp = a_list[i]
                a_list[i] = a_list[i + 1]
                a_list[i + 1] = temp
a_list = [54, 26, 93, 17, 77, 31, 44, 55, 20]
bubble_sort(a_list)
print(a_list)
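The pair counting in the explanation above — $(n-1) + (n-2) + \cdots + 1 = n(n-1)/2$ comparisons — can be verified with a small sketch (the comparison counter is an addition here, not part of the lecture code):

```python
# Bubble sort instrumented to count how many comparisons it makes.
def bubble_sort_count(a_list):
    comparisons = 0
    for pass_num in range(len(a_list) - 1, 0, -1):
        for i in range(pass_num):
            comparisons += 1
            if a_list[i] > a_list[i + 1]:
                a_list[i], a_list[i + 1] = a_list[i + 1], a_list[i]
    return comparisons

# For n = 9 items: 8 + 7 + ... + 1 = 9 * 8 // 2 = 36 comparisons.
print(bubble_sort_count([54, 26, 93, 17, 77, 31, 44, 55, 20]))  # 36
```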
2. Selection sort
  • The selection sort improves on the bubble sort by making only one exchange for every pass through the list.
  • As with a bubble sort, after the first pass, the largest item is in the correct place. After the second pass, the next largest is in place. This process continues and requires $n$ − 1 passes to sort $n$ items, since the final item must be in place after the ($n$ − 1)st pass.
  • Suppose you have a bunch of music on your computer. For each artist, you have a play count.

  • One way is to go through the list and find the most-played artist. Add that artist to a new list.

  • To find the artist with the highest play count, you have to check each item in the list. This takes $O(n)$ time, as you just saw. So you have an operation that takes $O(n)$ time, and you have to do it $n$ times:

  • This takes $O(n \times n)$ time, or $O(n^2)$ time.
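The procedure just described can be sketched directly. The data and helper names below (`find_most_played`, `sort_by_plays`, and the play counts) are made up for illustration:

```python
# O(n) scan: find the artist with the highest play count.
def find_most_played(counts):
    best = None
    for artist, plays in counts.items():
        if best is None or plays > counts[best]:
            best = artist
    return best

# n scans of O(n) each: O(n^2) overall.
def sort_by_plays(counts):
    counts = dict(counts)  # work on a copy
    ordered = []
    while counts:
        artist = find_most_played(counts)
        ordered.append(artist)
        del counts[artist]
    return ordered

plays = {'radiohead': 156, 'kishore kumar': 141, 'the strokes': 61}
print(sort_by_plays(plays))  # ['radiohead', 'kishore kumar', 'the strokes']
```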

def selection_sort(a_list):
    for fill_slot in range(len(a_list) - 1, 0, -1):
        pos_of_max = 0
        for location in range(1, fill_slot + 1):
            if a_list[location] > a_list[pos_of_max]:
                pos_of_max = location

        temp = a_list[fill_slot]
        a_list[fill_slot] = a_list[pos_of_max]
        a_list[pos_of_max] = temp
3. Insertion Sort
  • The insertion sort keeps a sorted sub-list in the lower positions of the list. Each new item is "inserted" back into that sub-list by shifting larger items one position to the right until the correct spot is found.

def insertion_sort(a_list):
    for index in range(1, len(a_list)):
        current_value = a_list[index]
        position = index
        while position > 0 and a_list[position - 1] > current_value:
            a_list[position] = a_list[position - 1]
            position = position - 1
        a_list[position] = current_value

14.3 Sorting with Recursion

4. Merge Sort
  • Splitting the List in a Merge Sort
  • Merge Together

def merge_sort(a_list):
    print("Splitting ", a_list)
    if len(a_list) > 1:
        mid = len(a_list) // 2
        left_half = a_list[:mid]
        right_half = a_list[mid:]

        merge_sort(left_half)
        merge_sort(right_half)

        i = 0
        j = 0
        k = 0
  • Merge Together

        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                a_list[k] = left_half[i]
                i = i + 1
            else:
                a_list[k] = right_half[j]
                j = j + 1
            k = k + 1

        while i < len(left_half):
            a_list[k] = left_half[i]
            i = i + 1
            k = k + 1

        while j < len(right_half):
            a_list[k] = right_half[j]
            j = j + 1
            k = k + 1
    print("Merging ", a_list)
  • In order to analyze the merge_sort function, we need to consider the two distinct processes that make up its implementation.
  • The result of this analysis is that $\log n$ splits, each of which costs $n$, give a total of $n \log n$ operations. A merge sort is an $O(n \log n)$ algorithm.
5. Quick Sort
  • Base case: an empty array or an array with a single element is already sorted.

  • An array with two elements is pretty easy to sort, too: swap them if they are out of order.

  • What about an array of three elements?

  • We use D&C to solve this problem. Let's pick a pivot at first, say, 33.

  • This is called partitioning. Now you have:

    A sub-array of all the numbers less than the pivot

    The pivot

    A sub-array of all the numbers greater than the pivot

  • If the sub-arrays are sorted, then you can combine the whole thing like this—left array + pivot + right array—and you get a sorted array.
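That divide-and-conquer description can be written almost word for word. The sketch below (`quicksort_simple`, a name introduced here) is a concise non-in-place variant, different from the partition-based implementation given later in this section:

```python
# Quicksort written exactly as "left sub-array + pivot + right sub-array".
def quicksort_simple(items):
    if len(items) <= 1:  # base case: 0 or 1 element is already sorted
        return items
    pivot = items[0]
    less = [x for x in items[1:] if x <= pivot]
    greater = [x for x in items[1:] if x > pivot]
    return quicksort_simple(less) + [pivot] + quicksort_simple(greater)

print(quicksort_simple([33, 10, 15, 7]))  # [7, 10, 15, 33]
```

This version allocates new lists on every call, so it is easier to read but less memory-efficient than the in-place partition version.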

  • Suppose you have this array of five elements.

  • For example, suppose you pick 3 as the pivot. You call quicksort on the sub-arrays.

def quick_sort(a_list):
    quick_sort_helper(a_list, 0, len(a_list) - 1)
def quick_sort_helper(a_list, first, last):
    if first < last:
        split_point = partition(a_list, first, last)
        quick_sort_helper(a_list, first, split_point - 1)
        quick_sort_helper(a_list, split_point + 1, last)

def partition(a_list, first, last):
    pivot_value = a_list[first]
    left_mark = first + 1
    right_mark = last
    done = False
    while not done:
        while left_mark <= right_mark and a_list[left_mark] <= pivot_value:
            left_mark = left_mark + 1

        while left_mark <= right_mark and a_list[right_mark] >= pivot_value:
            right_mark = right_mark - 1

        if right_mark < left_mark:
            done = True
        else:
            temp = a_list[left_mark]
            a_list[left_mark] = a_list[right_mark]
            a_list[right_mark] = temp

    temp = a_list[first]
    a_list[first] = a_list[right_mark]
    a_list[right_mark] = temp

    return right_mark
  • Quicksort is unusual in that its running time — its big O — depends on the pivot you pick.

  • In the best case, the big O of quick sort is $O(n \log n)$. However, in the worst case it turns out to be $O(n^2)$.

  • Why?

  • Worst Case: if the list is already sorted and you always pick the first element as the pivot, every partition leaves all the remaining items on one side. That gives $n$ levels of recursion with $O(n)$ work per level, i.e. $O(n^2)$.

  • Best Case: if every pivot lands near the middle, the list is halved at each level. The recursion is only $\log n$ levels deep, with $O(n)$ total work per level, i.e. $O(n \log n)$.

  • If you pick the pivot randomly, the average case matches the best case: $O(n \log n)$.

  • A variant of the Insertion Sort: Shell Sort
  • The sorting algorithm in Python: Timsort
  • Timsort is a hybrid stable sorting algorithm, derived from merge sort and insertion sort, designed to perform well on many kinds of real-world data.
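In practice you rarely implement these algorithms by hand: Python's built-in `sorted()` and `list.sort()` already use Timsort. For example:

```python
# Python's built-in sort is Timsort. It is stable, so items that
# compare equal keep their original relative order.
data = [54, 26, 93, 17, 77, 31, 44, 55, 20]
print(sorted(data))  # new sorted list; data itself is unchanged

words = ['pear', 'fig', 'apple', 'kiwi']
words.sort(key=len)  # stable: 'pear' stays before 'kiwi' (both length 4)
print(words)         # ['fig', 'pear', 'kiwi', 'apple']
```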


  • Recursion Examples
  • Sorting Algorithms