Communication - Shortest Path Algorithms.
Cryptography - Number-Theoretic Algorithms.
Computer Graphics - Geometric Algorithms.
Database Indices - Balanced Search Tree Data Structures.
Computational Biology - Dynamic Programming.
-----------------------------------------------------------------------------------------------------------
KARATSUBA MULTIPLICATION.
We measure an algorithm's running time not by how it performs on one particular input, but by how it scales as the input size grows.
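A rough C sketch of the idea (my own illustration, not from the notes): split each operand around half its digits and use Gauss's trick so only three recursive multiplications are needed instead of four. Assumes non-negative operands small enough for 64-bit arithmetic, splitting in base 10.

#include <stdio.h>

/* Multiply x and y with three recursive multiplications per level (Gauss's trick).
   Base-10 splitting; small operands only, no overflow checks. */
long long karatsuba(long long x, long long y) {
    if (x < 10 || y < 10)                       /* single digit: multiply directly */
        return x * y;

    long long big = (x > y) ? x : y;            /* count digits of the larger operand */
    int n = 0;
    while (big > 0) { n++; big /= 10; }

    long long pow = 1;                          /* pow = 10^(n/2) */
    for (int i = 0; i < n / 2; i++) pow *= 10;

    long long a = x / pow, b = x % pow;         /* x = a*pow + b */
    long long c = y / pow, d = y % pow;         /* y = c*pow + d */

    long long ac = karatsuba(a, c);
    long long bd = karatsuba(b, d);
    long long ad_plus_bc = karatsuba(a + b, c + d) - ac - bd;   /* only 3 multiplies */

    return ac * pow * pow + ad_plus_bc * pow + bd;
}

int main(void) {
    printf("%lld\n", karatsuba(1234, 5678));    /* prints 7006652 */
    return 0;
}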
Randomized Algorithms - Different runs can behave differently (output or running time) on the same input, because the algorithm makes random choices.
----------------------------------------------------------------------------------------------------------
MERGESORT ALGORITHM:
Claim: Merge sort takes <= 6n*lg(n) + 6n operations.
Proof: Work done at each level of the recursion <= 6n.
Number of levels = lg(n) + 1.
Multiply the two to get the claimed bound.
Think of lg(n) as the number of times we need to divide n by the base (here, 2) to get down to 1.
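Quick sanity check (my own numbers): for n = 8 there are lg(8) + 1 = 4 levels, each doing at most 6*8 = 48 operations, so the total is at most 4*48 = 192 = 6*8*lg(8) + 6*8.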
Recursion Tree Method.
Asymptotic Analysis: Suppress constant factors and lower-order terms.
Formal Definition: T(n) = O(f(n)) if there exist constants c and n0 such that T(n) <= c*f(n) for all n >= n0. Here n0 marks "sufficiently large n" and c is the constant factor.
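Example (my own check, using the merge sort bound above): 6n*lg(n) + 6n = O(n*lg(n)). Take c = 12 and n0 = 2: for n >= 2 we have lg(n) >= 1, so 6n <= 6n*lg(n), and therefore 6n*lg(n) + 6n <= 12*n*lg(n).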
------------------------------------------------------------------------------------------------------------
DIVIDE AND CONQUER PARADIGM:
Inversions in an array: pairs (i, j) with i < j but A[i] > A[j].
Used to measure how different two rankings of the same items are from each other.
Inversion Algo:
sort_and_count(Array A, int n) {
    if (n == 1)
        return 0;
    else {
        int x = sort_and_count(first half of A, n/2);   // also sorts the first half
        int y = sort_and_count(second half of A, n/2);  // also sorts the second half
        int z = merge_and_countSplitInv(A, n);          // merge sorted halves, count split inversions
        return x + y + z;
    }
}
NOTE: If there are no split inversions, every element in left array B is smaller than every element in right array C.
The number of split inversions involving an element y of C is exactly the number of elements remaining in B when y is copied to the output array D.
Hence we piggyback on merge sort to count inversions, and the running time is the same as merge sort: O(n log n).
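A runnable C fleshing-out of the pseudocode above (my own sketch; function and variable names are mine): the merge step is an ordinary merge that also adds mid - i + 1 to the count whenever an element of the right half is copied out before the rest of the left half.

#include <stdio.h>

/* Merge the sorted halves A[lo..mid] and A[mid+1..hi], returning the number of
   split inversions: pairs with the larger index in the right half but the smaller value. */
long long merge_and_count_split_inv(int A[], int lo, int mid, int hi, int tmp[]) {
    long long splits = 0;
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi) {
        if (A[i] <= A[j]) {
            tmp[k++] = A[i++];
        } else {
            splits += mid - i + 1;   /* A[j] jumps ahead of everything left in the first half */
            tmp[k++] = A[j++];
        }
    }
    while (i <= mid) tmp[k++] = A[i++];
    while (j <= hi)  tmp[k++] = A[j++];
    for (k = lo; k <= hi; k++) A[k] = tmp[k];
    return splits;
}

long long sort_and_count(int A[], int lo, int hi, int tmp[]) {
    if (lo >= hi) return 0;                                   /* 0 or 1 element: no inversions */
    int mid = lo + (hi - lo) / 2;
    long long x = sort_and_count(A, lo, mid, tmp);            /* left inversions  */
    long long y = sort_and_count(A, mid + 1, hi, tmp);        /* right inversions */
    long long z = merge_and_count_split_inv(A, lo, mid, hi, tmp);  /* split inversions */
    return x + y + z;
}

int main(void) {
    int A[] = {1, 3, 5, 2, 4, 6};
    int tmp[6];
    printf("%lld\n", sort_and_count(A, 0, 5, tmp));           /* prints 3 */
    return 0;
}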
---------------------------------------------------------------------------------------------------------------------------------
STRASSEN'S MATRIX MULTIPLICATION METHOD:
Covered in class before; check online. Very straightforward.
---------------------------------------------------------------------------------------------------------------------------------
QUICKSORT:
Pick a pivot. Everything to the left of the pivot should be less than it, and everything to the right should be greater.
Partitioning can be done in O(n) time.
Reduces problem size. Doesn't need extra memory.
PARTITION SUBROUTINE:
If the pivot is not the first element, swap it with the first element so the pivot sits at the front.
Divide the array into two parts: what we have seen and what we haven't. The seen part is kept correctly partitioned, i.e., everything smaller than the pivot comes before everything larger.
j - Boundary point between what we have seen and what we haven't.
i - Boundary between less than pivot and more than pivot. Hence, it points to the last element smaller than pivot.
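A small C sketch of this partition step (my own code, not the course's exact listing), assuming the pivot has already been swapped into A[lo]; i and j play the roles described above.

void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Partition A[lo..hi] around the pivot A[lo]; returns the pivot's final index. */
int partition(int A[], int lo, int hi) {
    int p = A[lo];
    int i = lo;                          /* last element known to be < pivot */
    for (int j = lo + 1; j <= hi; j++) { /* j = seen/unseen boundary */
        if (A[j] < p) {                  /* found a small element: grow the "< pivot" region */
            i++;
            swap(&A[i], &A[j]);
        }
    }
    swap(&A[lo], &A[i]);                 /* place the pivot between the two regions */
    return i;
}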
QuickSort(Array A, int n) {
    if (n <= 1) return;
    p = choose_pivot(A, n);
    partition A around p;
    recursively sort the part to the left of the pivot;
    recursively sort the part to the right of the pivot;
}
Running time depends on how well the pivot element is chosen.
NOTE: If an already-sorted array is passed to QuickSort and the first element is always chosen as the pivot, the running time will be O(n^2).
For every input array of size n, QuickSort with uniformly random pivots has expected running time O(n log n).
The best case is to choose the median every time, but any split at least as balanced as 25-75 is good enough to get O(n log n) time.
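A hedged sketch of the full recursion with a uniformly random pivot (my own code, reusing the swap and partition sketch above; rand() seeding omitted for brevity).

#include <stdlib.h>

void quicksort(int A[], int lo, int hi) {
    if (lo >= hi) return;                     /* 0 or 1 element: already sorted */
    int r = lo + rand() % (hi - lo + 1);      /* pick a pivot index uniformly at random */
    swap(&A[lo], &A[r]);                      /* move the pivot to the front */
    int p = partition(A, lo, hi);             /* pivot lands at its final index p */
    quicksort(A, lo, p - 1);                  /* recurse on the part left of the pivot  */
    quicksort(A, p + 1, hi);                  /* recurse on the part right of the pivot */
}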
-----------------------------------------------------------
MASTER THEOREM:
Recursive algorithm ingredients:
Base case, work done in the recursive calls, and work done outside the recursive calls.
Assumptions:
All subproblems have the same size.
n is a power of b.
Master format: T(n) <= a*T(n/b) + O(n^d), or equivalently T(n) <= a*T(n/b) + c*n^d.
At level j of the recursion tree there are a^j subproblems, each of size n/b^j, so the work at level j is <= a^j * c * (n/b^j)^d = c*n^d * (a/b^d)^j. Summing over all levels:
Total work <= c*n^d * sum from j = 0 to log_b(n) of (a/b^d)^j. (Use merge sort as the guiding example.)
a^(log_b(n)) = number of leaves in the recursion tree (equivalently n^(log_b(a))).
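Sanity check with merge sort (my own worked example): a = 2, b = 2, d = 1, so a/b^d = 1 and every term of the sum equals 1. There are log_2(n) + 1 terms, so total work <= c*n*(log_2(n) + 1) = O(n*log(n)), matching the earlier merge sort bound.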