I learned a little bit about sorting in csc108. I learned three simple ways of sorting: bubble sort, insertion sort, and selection sort. They are easy to understand, and none of them uses recursion. For example,
def select(L):
    """Produce a copy of L in sorted order by selection sort.

    >>> select([3, 1, 2])
    [1, 2, 3]
    """
    L1 = L[:]
    for i in range(len(L1)):
        m = min(L1[i:])
        j = L1.index(m, i)
        L1[i], L1[j] = L1[j], L1[i]
    return L1
This is the code for selection sort: choose the minimum number in the list from index i to the end, then swap that minimum with the number at index i. By repeating this in a for loop, we end up with a sorted list.
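The other two elementary sorts follow a similar pattern. Here is a minimal insertion sort in the same style as select above (the name insert and the copy-then-sort approach are my own choices, not necessarily how the course wrote it):

```python
def insert(L):
    """Produce a copy of L in sorted order by insertion sort.

    >>> insert([3, 1, 2])
    [1, 2, 3]
    """
    L1 = L[:]
    for i in range(1, len(L1)):
        v = L1[i]
        j = i
        # shift larger items one slot to the right until v's spot is found
        while j > 0 and L1[j - 1] > v:
            L1[j] = L1[j - 1]
            j -= 1
        L1[j] = v
    return L1
```

Instead of selecting the minimum of the unsorted part, it grows a sorted prefix by inserting each new item into place.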
In csc148, I learned some new ways of sorting, and also how to determine time complexity. Time complexity is commonly expressed using big-O notation, as a function of the length of the list. The average time complexity for selection, insertion, and bubble sort is O(n^2). Quick sort and merge sort are two new ways of sorting. For example,
def quick(L):
    if len(L) > 1:
        pivot = L[0]  # there are much better ways of choosing the pivot!
        smaller_than_pivot = [x for x in L[1:] if x < pivot]
        larger_than_pivot = [x for x in L[1:] if x >= pivot]
        return quick(smaller_than_pivot) + [pivot] + quick(larger_than_pivot)
    else:
        return L
We divide the list into three parts (smaller than the pivot, the pivot itself, and larger than or equal to the pivot), then recursively sort the smaller and larger parts. The time complexity can be as bad as O(n^2). One such case is when every number in the list is the same: with this pivot choice, all the remaining elements land in larger_than_pivot, so each recursive call shrinks the list by only one and the recursion depth is n. Merge
sort is a little bit different from quick sort. Firstly, we need to implement a
function to put two lists together. And we also need a function to divide the
list into 2 same length lists. The complexity for merge sort is same with quick
sort. Merge sort and quick sort both use recursion. They both divide the list
into parts than put them together. Every way of sorting has its own features. When
we solve some computer problem, everyone will write different code.
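The merge sort described above can be sketched like this (the function names merge and merge_sort are my own; the course's version may look different):

```python
def merge(L1, L2):
    """Merge two sorted lists into one sorted list."""
    result = []
    i, j = 0, 0
    while i < len(L1) and j < len(L2):
        if L1[i] <= L2[j]:
            result.append(L1[i])
            i += 1
        else:
            result.append(L2[j])
            j += 1
    # at this point at least one of the two leftovers is empty
    return result + L1[i:] + L2[j:]


def merge_sort(L):
    """Produce a copy of L in sorted order by merge sort."""
    if len(L) <= 1:
        return L[:]
    mid = len(L) // 2
    # sort each half, then merge the two sorted halves
    return merge(merge_sort(L[:mid]), merge_sort(L[mid:]))
```

Because the split is always down the middle, the recursion depth is about log n regardless of the input, which is why merge sort avoids quick sort's O(n^2) worst case.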