Introduction to Algorithms
Lecture 1
Prof. Dr. Aydin Ozturk

Introduction
• The methods of algorithm design form one of the core practical technologies of computer science.
• The main aim of this lecture is to familiarize the student with the framework we shall use throughout the course for the design and analysis of algorithms.
• We start with a discussion of the algorithms needed to solve computational problems. The problem of sorting is used as a running example.
• We introduce a pseudocode to show how we shall specify the algorithms.

Algorithms
• The word algorithm comes from the name of the Persian mathematician Abu Ja'far Mohammed ibn Musa al-Khwarizmi.
• In computer science, the word refers to a precise method usable by a computer for the solution of a problem. The statement of the problem specifies in general terms the desired input/output relationship.
• For example, sorting a given sequence of numbers into nondecreasing order provides fertile ground for introducing many standard design techniques and analysis tools.

The problem of sorting
Insertion Sort
Example of Insertion Sort (worked step by step on the slides)

Analysis of algorithms
The theoretical study of computer-program performance and resource usage.
What's more important than performance?
• modularity
• correctness
• maintainability
• functionality
• robustness
• user-friendliness
• programmer time
• simplicity
• extensibility
• reliability

Analysis of algorithms
Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is impossible.
• Algorithmic mathematics provides a language for talking about program behavior.
• The lessons of program performance generalize to other computing resources.
• Speed is fun!

Running Time
• The running time depends on the input: an already sorted sequence is easier to sort.
• Parameterize the running time by the size of the input, since short sequences are easier to sort than long ones.
• Generally, we seek upper bounds on the running time, because everybody likes a guarantee.

Kinds of analyses
Worst-case: (usually)
• T(n) = maximum time of the algorithm on any input of size n.
Average-case: (sometimes)
• T(n) = expected time of the algorithm over all inputs of size n.
• Needs an assumption about the statistical distribution of inputs.
Best-case:
• Cheat with a slow algorithm that works fast on some input.

Machine-independent time: The RAM Model
Machine-independent algorithm design depends on a hypothetical computer called the Random Access Machine (RAM).
Assumptions:
• Each simple operation (+, −, if, etc.) takes exactly one time step.
• Loops and subroutines are not considered simple operations.
• Each memory access takes exactly one time step.

Machine-independent time
What is insertion sort's worst-case time?
• It depends on the speed of our computer:
• relative speed (on the same machine),
• absolute speed (on different machines).
BIG IDEA:
• Ignore machine-dependent constants.
• Look at the growth of T(n) as n → ∞.
This is "Asymptotic Analysis".

Machine-independent time: An example
Pseudocode for insertion sort (INSERTION-SORT):

INSERTION-SORT(A)
1  for j ← 2 to length[A]
2      do key ← A[j]
3         ▷ Insert A[j] into the sorted sequence A[1 .. j−1].
4         i ← j − 1
5         while i > 0 and A[i] > key
6             do A[i+1] ← A[i]
7                i ← i − 1
8         A[i+1] ← key
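The pseudocode above translates almost line for line into a running program. Here is a minimal Python sketch of the same procedure, under one assumption: Python lists are 0-indexed, so the outer loop starts at position 1 rather than at index 2 as in the 1-indexed pseudocode.

```python
def insertion_sort(a):
    """In-place insertion sort, following the INSERTION-SORT pseudocode."""
    for j in range(1, len(a)):            # for j <- 2 to length[A]
        key = a[j]                        # key <- A[j]
        i = j - 1                         # i <- j - 1
        # Shift elements of the sorted prefix a[0 .. j-1] that are
        # greater than key one position to the right.
        while i >= 0 and a[i] > key:      # while i > 0 and A[i] > key
            a[i + 1] = a[i]               # A[i+1] <- A[i]
            i -= 1                        # i <- i - 1
        a[i + 1] = key                    # A[i+1] <- key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```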
Analysis of INSERTION-SORT (cont'd.)

INSERTION-SORT(A)                                              cost   times
1  for j ← 2 to length[A]                                      c1     n
2      do key ← A[j]                                           c2     n − 1
3         ▷ Insert A[j] into the sorted sequence A[1 .. j−1]   0      n − 1
4         i ← j − 1                                            c4     n − 1
5         while i > 0 and A[i] > key                           c5     Σ_{j=2..n} t_j
6             do A[i+1] ← A[i]                                 c6     Σ_{j=2..n} (t_j − 1)
7                i ← i − 1                                     c7     Σ_{j=2..n} (t_j − 1)
8         A[i+1] ← key                                         c8     n − 1

Here t_j denotes the number of times the while-loop test in line 5 is executed for that value of j.

Analysis of INSERTION-SORT (cont'd.)
The total running time is
T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·Σ_{j=2..n} t_j + c6·Σ_{j=2..n} (t_j − 1) + c7·Σ_{j=2..n} (t_j − 1) + c8·(n − 1).

Analysis of INSERTION-SORT (cont'd.)
The best case: the array is already sorted (t_j = 1 for j = 2, 3, ..., n), so
T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·(n − 1) + c8·(n − 1)
     = (c1 + c2 + c4 + c5 + c8)·n − (c2 + c4 + c5 + c8),
a linear function of n.

Analysis of INSERTION-SORT (cont'd.)
The worst case: the array is reverse sorted (t_j = j for j = 2, 3, ..., n). Using Σ_{j=2..n} j = n(n + 1)/2 − 1 and Σ_{j=2..n} (j − 1) = n(n − 1)/2,
T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·(n(n + 1)/2 − 1) + c6·(n(n − 1)/2) + c7·(n(n − 1)/2) + c8·(n − 1)
     = (c5/2 + c6/2 + c7/2)·n² + (c1 + c2 + c4 + c5/2 − c6/2 − c7/2 + c8)·n − (c2 + c4 + c5 + c8).
Thus T(n) = an² + bn + c, a quadratic function of n.
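The linear best case and quadratic worst case are easy to observe numerically. The sketch below instruments the sort to count how many times the line-5 test runs (i.e. Σ t_j); the function name and the choice n = 1000 are illustrative only.

```python
def insertion_sort_tests(a):
    """Return the total number of executions of the line-5 test (sum of t_j)."""
    tests = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while True:
            tests += 1                              # one execution of "i > 0 and A[i] > key"
            if not (i >= 0 and a[i] > key):
                break
            a[i + 1] = a[i]                         # shift right
            i -= 1
        a[i + 1] = key
    return tests

n = 1000
print(insertion_sort_tests(list(range(n))))         # sorted input:   n - 1        = 999
print(insertion_sort_tests(list(range(n, 0, -1))))  # reverse sorted: n(n+1)/2 - 1 = 500499
```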
Growth of Functions
Although we can sometimes determine the exact running time of an algorithm, the extra precision is usually not worth the effort of computing it. For large inputs, the multiplicative constants and lower-order terms of an exact running time are dominated by the effects of the input size itself.

Asymptotic Notation
The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domain is the set of natural numbers N = {0, 1, 2, ...}.

O-notation
• For a given function g(n), we denote by O(g(n)) the set of functions
  O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.
• We use O-notation to give an asymptotic upper bound on a function, to within a constant factor.
• f(n) = O(g(n)) means that there exists some constant c such that f(n) ≤ c·g(n) for large enough n.

Ω-notation
• For a given function g(n), we denote by Ω(g(n)) the set of functions
  Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.
• We use Ω-notation to give an asymptotic lower bound on a function, to within a constant factor.
• f(n) = Ω(g(n)) means that there exists some constant c such that f(n) ≥ c·g(n) for large enough n.

Θ-notation
• For a given function g(n), we denote by Θ(g(n)) the set of functions
  Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.
• A function f(n) belongs to the set Θ(g(n)) if there exist positive constants c1 and c2 such that it can be "sandwiched" between c1·g(n) and c2·g(n) for sufficiently large n.
• f(n) = Θ(g(n)) means that there exist constants c1 and c2 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for large enough n.

Asymptotic notation
Graphic examples of Θ, O, and Ω (figure on the slides).

Example 1.
Show that (1/2)n² − 3n = Θ(n²).
We must find c1 and c2 such that
  c1·n² ≤ (1/2)n² − 3n ≤ c2·n².
Dividing through by n² yields
  c1 ≤ 1/2 − 3/n ≤ c2.
For n0 = 7 this holds with, for example, c1 = 1/14 and c2 = 1/2, so (1/2)n² − 3n = Θ(n²).

Theorem
• For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Example 2.
f(n) = 3n² + 2n + 5 = Θ(n²), because
  3n² + 2n + 5 = Ω(n²) and
  3n² + 2n + 5 = O(n²).

Example 3. Let f(n) = 3n² − 100n + 6.
• f(n) = O(n²), since for c = 3, 3n² > 3n² − 100n + 6.
• f(n) = O(n³), since for c = 1, n³ > 3n² − 100n + 6 when n > 3.
• f(n) ≠ O(n), since for any c, c·n < 3n² when n > c.
• f(n) = Ω(n²), since for c = 2, 2n² < 3n² − 100n + 6 when n > 100.
• f(n) ≠ Ω(n³), since for any c, c·n³ > 3n² − 100n + 6 when n is large enough.
• f(n) = Ω(n), since for c = 1, n < 3n² − 100n + 6 when n > 100.
• f(n) = Θ(n²), since both O and Ω apply.
• f(n) ≠ Θ(n³), since only O applies.
• f(n) ≠ Θ(n), since only Ω applies.

o-notation
• We use o-notation to denote an upper bound that is not asymptotically tight.
• We formally define o(g(n)) as the set
  o(g(n)) = { f(n) : for any positive constant c > 0 there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.
• Equivalently, f(n) = o(g(n)) if and only if lim_{n→∞} f(n)/g(n) = 0.

Example 4.
• f(n) = 2n² = O(n²): the bound is asymptotically tight.
• f(n) = 2n = O(n²): the bound is not asymptotically tight.
• That is, 2n = o(n²), but 2n² ≠ o(n²).
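These claims are easy to sanity-check numerically. The Python sketch below checks the constants quoted in Example 3 over a sample range and prints the ratios behind Example 4 (a finite numeric illustration only, not a proof; the function names are chosen for this sketch).

```python
def f(n):
    return 3 * n**2 - 100 * n + 6              # the f(n) of Example 3

# O(n^2) with c = 3: 3n^2 stays above f(n) for every n >= 1.
assert all(f(n) <= 3 * n**2 for n in range(1, 10_000))

# Omega(n^2) with c = 2: 2n^2 stays below f(n) once n > 100.
assert all(2 * n**2 <= f(n) for n in range(101, 10_000))

# Example 4: 2n/n^2 tends to 0 (so 2n = o(n^2)),
# while 2n^2/n^2 stays at 2 (so 2n^2 is not o(n^2)).
for n in (10, 1_000, 100_000):
    print(n, 2 * n / n**2, 2 * n**2 / n**2)
```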
ω-notation
• We use ω-notation to denote a lower bound that is not asymptotically tight.
• We formally define ω(g(n)) as the set
  ω(g(n)) = { f(n) : for any positive constant c > 0 there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.
• The relation f(n) = ω(g(n)) implies that lim_{n→∞} f(n)/g(n) = ∞.

Example 5.
f(n) = n²/2 = ω(n), but n²/2 ≠ ω(n²).

Standard notations and common functions
• Floors and ceilings: for any real x,
  x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1.
• Modular arithmetic: for any integer a and positive integer n,
  a mod n = a − ⌊a/n⌋·n.
• Polynomials: given a nonnegative integer d, a polynomial in n of degree d is
  p(n) = Σ_{i=0..d} a_i·n^i.
• Exponentials: for all real x,
  e^x = 1 + x + x²/2! + x³/3! + ⋯ = Σ_{i=0..∞} x^i / i!.
• Logarithms (notation):
  lg n = log₂ n, ln n = log_e n, lg^k n = (lg n)^k, lg lg n = lg(lg n).
• Logarithms (identities): for all real a > 0, b > 0, c > 0, and n,
  a = b^(log_b a),
  log_c(ab) = log_c a + log_c b,
  log_b a^n = n·log_b a,
  log_b a = log_c a / log_c b,
  log_b(1/a) = −log_b a,
  a^(log_b c) = c^(log_b a),
  log_b a = 1 / log_a b.
• Series expansion: for |x| < 1,
  ln(1 + x) = x − x²/2 + x³/3 − x⁴/4 + x⁵/5 − ⋯,
  and for x > −1,
  x/(1 + x) ≤ ln(1 + x) ≤ x.
• Factorials: for n ≥ 0, Stirling's approximation gives
  n! = √(2πn)·(n/e)^n·(1 + Θ(1/n)),
  and hence n! = o(n^n), n! = ω(2^n), and lg(n!) = Θ(n lg n).

Designing algorithms
There are many ways to design algorithms:
• Insertion sort uses an incremental approach.
• Merge sort uses a divide-and-conquer approach.

Insertion sort analysis
Worst-case running time: Θ(n²), as derived above.

Merge Sort
MERGE-SORT(A, p, r)
1  if p < r
2      then q ← ⌊(p + r)/2⌋
3           MERGE-SORT(A, p, q)
4           MERGE-SORT(A, q + 1, r)
5           MERGE(A, p, q, r)

Merging two sorted arrays (worked step by step on the slides)

Analyzing merge sort
Recurrence for merge sort:
  T(n) = Θ(1) if n = 1,
  T(n) = 2T(n/2) + Θ(n) if n > 1,
which solves to T(n) = Θ(n lg n).

Recursion tree (built level by level on the slides)
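To make the MERGE-SORT pseudocode concrete, here is a short Python sketch that follows it line by line; the body of MERGE is not shown above, so the merge() helper below is a standard two-pointer merge supplied as an assumption.

```python
def merge_sort(a, p=0, r=None):
    """Sort a[p..r] in place (0-based, inclusive bounds), following MERGE-SORT."""
    if r is None:
        r = len(a) - 1
    if p < r:                      # if p < r
        q = (p + r) // 2           # q <- floor((p + r) / 2)
        merge_sort(a, p, q)        # MERGE-SORT(A, p, q)
        merge_sort(a, q + 1, r)    # MERGE-SORT(A, q+1, r)
        merge(a, p, q, r)          # MERGE(A, p, q, r)

def merge(a, p, q, r):
    """Merge the sorted subarrays a[p..q] and a[q+1..r] into a[p..r]."""
    left, right = a[p:q + 1], a[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # Take the smaller front element; fall back to the other side
        # once one of the two halves is exhausted.
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            a[k] = left[i]
            i += 1
        else:
            a[k] = right[j]
            j += 1

data = [5, 2, 4, 7, 1, 3, 2, 6]
merge_sort(data)
print(data)   # [1, 2, 2, 3, 4, 5, 6, 7]
```

Each call splits its range in half and merges in linear time, which is exactly the T(n) = 2T(n/2) + Θ(n) recurrence noted above.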