
Communication - Shortest Path Algorithms.

Cryptography - Number-Theoretic Algorithms.

Computer Graphics - Geometric Algorithms.

Database Indices - Balanced Search tree data structures.

Computational biology - Dynamic Programming.

-----------------------------------------------------------------------------------------------------------

KARATSUBA MULTIPLICATION.

  • Divide the number into two parts; the first part gets one extra digit if the number of digits is odd.
  • i.e., for x*y: x = 10^(n/2)a + b, y = 10^(n/2)c + d.
  • Therefore: x*y = 10^(n)(ac) + 10^(n/2)(ad+bc) + bd.
  • Three recursive calls. For the middle term, compute (a+b)(c+d) recursively and subtract the ac and bd computed earlier, since (a+b)(c+d) - ac - bd = ad + bc.
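The steps above can be sketched in Python. This is a minimal version for non-negative integers; the function name `karatsuba` and the single-digit base case are choices made here, not from the notes:

```python
# A sketch of Karatsuba multiplication following the decomposition
# x = 10^(n/2)*a + b, y = 10^(n/2)*c + d from the notes.
def karatsuba(x, y):
    # Base case: single-digit operands are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    n = max(len(str(x)), len(str(y)))
    half = n // 2
    # divmod splits off the low `half` digits; for odd n the high
    # part (a, c) ends up with one extra digit, as described above.
    a, b = divmod(x, 10 ** half)
    c, d = divmod(y, 10 ** half)
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # Third recursive call: (a+b)(c+d) - ac - bd = ad + bc.
    middle = karatsuba(a + b, c + d) - ac - bd
    return ac * 10 ** (2 * half) + middle * 10 ** half + bd
```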

We measure the running time of an algorithm not by how it performs on one input, but by how it scales with larger inputs.

Randomized Algorithms - may behave differently (different outputs or running times) across runs on the same input.

----------------------------------------------------------------------------------------------------------

MERGESORT ALGORITHM:

  • Two recursive calls, one on each half of the array, to sort each half.
  • Merge the two sorted halves into ascending order.
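The two steps above can be sketched as follows; the name `merge_sort` is a choice made here, and this version returns a new sorted list rather than sorting in place:

```python
# A minimal merge sort sketch: two recursive calls on the halves,
# then a linear-time merge into ascending order.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge: repeatedly copy the smaller front element to the output.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```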

Claim: Merge sort performs <= 6n*lg(n) + 6n operations.

Proof: Work done at each level <= 6n.

No. of levels = lg(n) + 1.

Multiply the two.

Think of lg(n) as the number of times we need to divide n by the base (in this case, 2) to reach 1.

 

Recursive Tree Method.

  • Hence: total depth of the recursion tree in merge sort = lg(n) + 1.
  • No. of subproblems at level j = 2^j.
  • Array size of each subproblem at level j = n/2^j.

Asymptotic Analysis: Suppress constant factors and lower-order terms.

Formal Definition: T(n) = O(f(n)) if T(n) <= c*f(n) for all n >= n0, where n0 represents "sufficiently large" and c is a constant independent of n.
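As a quick numeric sanity check of the definition, take the merge sort bound 6n*lg(n) + 6n from above with the hypothetical witnesses c = 12 and n0 = 2 (both chosen here for illustration):

```python
import math

# Check that 6n*lg(n) + 6n <= c * n*lg(n) for c = 12, which reduces to
# 6n <= 6n*lg(n), i.e. lg(n) >= 1, i.e. n >= 2 (our n0).
def holds(n, c=12):
    return 6 * n * math.log2(n) + 6 * n <= c * n * math.log2(n)

assert all(holds(n) for n in range(2, 10_000))
```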

------------------------------------------------------------------------------------------------------------

DIVIDE AND CONQUER PARADIGM:

  • Divide.
  • Conquer via recursive calls.
  • Combine solutions of subproblems.

Inversions in an array: pairs of indices (i, j) with i < j but A[i] > A[j].

Used to measure how different two rankings of the same items are.

Inversion Algo:

sort_and_count(Array A, int n) {
    if (n == 1)
        return 0;
    else {
        int x = sort_and_count(first half of A, n/2);
        int y = sort_and_count(second half of A, n/2);
        int z = merge_and_countSplitInv(A, n);
        return x + y + z;
    }
}

NOTE: If there are no split inversions, every element in left array B is smaller than every element in right array C.

The number of split inversions involving an element Y of C is exactly the number of elements remaining in B when Y is copied to the output array D.

Hence we piggyback on merge sort to count inversions, so the running time is the same as merge sort's: O(n log n).
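The pseudocode above might look like this in Python, with `merge_and_countSplitInv` folded into the merge loop (the function name and the returned (sorted, count) pair are choices made here):

```python
def sort_and_count(a):
    """Return (sorted copy of a, number of inversions in a)."""
    n = len(a)
    if n <= 1:
        return a, 0
    left, x = sort_and_count(a[:n // 2])
    right, y = sort_and_count(a[n // 2:])
    # Merge and count split inversions: each time an element of the
    # right half is copied out, everything still remaining in the left
    # half forms a split inversion with it.
    merged, z, i, j = [], 0, 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
            z += len(left) - i
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, x + y + z
```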

---------------------------------------------------------------------------------------------------------------------------------

STRASSEN'S MATRIX MULTIPLICATION METHOD:

Done in class before. Check online. Very straightforward.

---------------------------------------------------------------------------------------------------------------------------------

 

QUICKSORT:

Pick a pivot. Everything to the left of the pivot should be less than it; everything to the right, greater.

Partitioning can be done in O(n) time.
Reduces problem size. Doesn't need extra memory.

  • Choose pivot element.
  • Rearrange such that everything left of it is smaller.

PARTITION SUBROUTINE:

If the chosen pivot is not the first element, swap it with the first element so the pivot sits at the front before partitioning.

Divide the array into two parts: what we have seen and what we haven't. The seen part is kept partitioned the right way, i.e., everything smaller than the pivot sits to the left of everything bigger.

j - Boundary point between what we have seen and what we haven't.
i - Boundary between less than pivot and more than pivot. Hence, it points to the last element smaller than pivot.

quicksort(Array A, int n) {
    if (n == 1) return;
    p = choose_Pivot(A, n);
    partition A around p;
    recursively call quicksort on the first part;
    recursively call quicksort on the second part;
}

Running time depends on how well the pivot element is chosen. 

NOTE: If an already-sorted array is passed to QuickSort and the first element is always chosen as the pivot, the running time will be O(n^2).

For every array of size n, QuickSort's average running time (over random pivot choices) is O(n log n).

Best case would be to choose the median every time, but even a 25-75 split at every level is good enough to get O(n log n) time.
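Putting the partition subroutine and the pseudocode together, one possible in-place sketch, with a uniformly random pivot as suggested by the average-case discussion (the exact boundary conventions for i and j are one of several equivalent choices):

```python
import random

# In-place quicksort with the i/j partition scheme described above.
def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    # Choose a random pivot and swap it into the first position.
    p = random.randint(lo, hi)
    a[lo], a[p] = a[p], a[lo]
    pivot = a[lo]
    i = lo + 1                        # a[lo+1 .. i-1] holds elements < pivot
    for j in range(lo + 1, hi + 1):   # j: boundary of what we have seen
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[lo], a[i - 1] = a[i - 1], a[lo]  # place pivot in its final position
    quicksort(a, lo, i - 2)            # recurse on the smaller-than part
    quicksort(a, i, hi)                # recurse on the greater-than part
```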

-----------------------------------------------------------

MASTER THEOREM:

Recursive Algo:
Base case, Work done in recursive call, work done outside the recursive call.

Assumptions :

All subproblems have the same size.
n is a power of b.

 

Master Format: T(N) <= aT(N/b) + O(N^d) or T(N) <= aT(N/b) + cN^d.

  • a-> No of recursive calls. or RATE OF SUBPROBLEM PROLIFERATION (RSP). (Evil)
  • b-> input size shrinkage factor 
  • d->Exponent of work done outside recursive calls.
  • b^d -> RATE OF WORK SHRINKAGE (RWS). (Good)

Work done at level j <= cN^d * (a/b^d)^j (derive this the same way as in merge sort). Hence: total work <= cN^d (work outside recursion) * sum from j = 0 to log_b(N) of (a/b^d)^j.

  • RSP = RWS: same amount of work done at each level, like merge sort. Hence, expect O(n^d log n). (Because in the sum, a/b^d = 1, so each of the log_b(n) + 1 levels contributes cn^d.)
  • RSP < RWS: less and less work with every level, so the maximum work is done at the root, and the root's time dominates the rest. Expect O(n^d).
  • RSP > RWS: more work done with each level, so the maximum work is at the leaves, and the leaves' time dominates. Hence, consider only that; the leaves sit at depth log_b(N).

a^(log_b(N)) = N^(log_b(a)) = no. of leaves in the recursion tree.
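The three cases can be packaged into a small helper for recurrences of the form T(n) <= a*T(n/b) + O(n^d). This function and its output strings are illustrative choices, not from the notes:

```python
import math

# Classify T(n) <= a*T(n/b) + O(n^d) by comparing RSP (= a) with RWS (= b^d).
def master_bound(a, b, d):
    if a == b ** d:            # RSP == RWS: equal work per level
        return f"O(n^{d} log n)"
    if a < b ** d:             # RSP < RWS: the root dominates
        return f"O(n^{d})"
    # RSP > RWS: the a^(log_b n) = n^(log_b a) leaves dominate
    return f"O(n^{math.log(a, b):.2f})"
```

For example, merge sort has a = 2, b = 2, d = 1, so a = b^d and the bound is O(n log n), matching the earlier analysis.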

 

 

MergeSort & Karatsuba

Module by Rahul Guin, updated more than 1 year ago