Class #02: Analysis of Algorithms I
Software Design III (CS 340): M. Allen, 27 Jan. 16
Wednesday, 27 Jan. 2016

What is an Algorithm?
• An algorithm is a well-defined, step-by-step procedure
• An algorithm (generally) takes some input and produces some output
• An algorithm is not the same thing as a program
  - Programs are concrete implementations of algorithms
  - There is usually more than one possible implementation for the same algorithm
[Image: Statue of Muhammad ibn Musa al-Khwarizmi (Uzbekistan), original author of Algoritmi de numero Indorum]
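As a small illustration of the algorithm-versus-program distinction (an added sketch, not from the slides; Java is assumed, since the deck does not name a course language), here are two different implementations of the same algorithm: scan an array once, remembering the largest element seen so far.

```java
// Two concrete implementations of the same algorithm:
// "scan the array once, remembering the largest element seen so far."
public class MaxExamples {
    // Implementation 1: indexed for-loop
    static int maxIndexed(int[] a) {
        int best = a[0];
        for (int i = 1; i < a.length; i++) {
            if (a[i] > best) best = a[i];
        }
        return best;
    }

    // Implementation 2: enhanced for-loop -- different code, same algorithm
    static int maxForEach(int[] a) {
        int best = a[0];
        for (int x : a) {
            if (x > best) best = x;
        }
        return best;
    }

    public static void main(String[] args) {
        int[] data = {3, 9, 4, 1, 7};
        System.out.println(maxIndexed(data));  // 9
        System.out.println(maxForEach(data));  // 9
    }
}
```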
Complexity
• An algorithm can be characterized in terms of its complexity:
  - Time complexity: how much time will the algorithm take to complete?
  - Space complexity: how much storage will the algorithm require?
• Sometimes time complexity must be minimized:
  - Real-time systems or interactive interfaces
• Sometimes space complexity must be minimized:
  - Mobile devices with limited memory
How to Measure Time Complexity?
• Empirical: implement the algorithm and run it to measure how long it takes on a variety of inputs (see the timing sketch after this list)
  - Must actually implement the algorithm
  - Results are system dependent (language, OS, hardware)
  - Results are for the tested input only, and may not "prove" anything about data that hasn't been tested
• Analytical: place upper and/or lower bounds on the length of time an algorithm will take by counting the number of operations required, based solely on a specification of how the algorithm works
  - Do not need to implement the algorithm
  - Results are not dependent on a particular system
  - Can be determined for all cases without performing any tests
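A minimal sketch of the empirical approach (added here, with Java assumed; the harness and any numbers it prints are illustrative only): run the code under test on a range of input sizes and record elapsed wall-clock time. As the bullets above warn, the results are specific to the machine, runtime, and inputs actually used.

```java
import java.util.Random;

// Rough empirical timing harness: run an algorithm on growing inputs
// and print elapsed wall-clock time for each size.
public class EmpiricalTiming {
    public static void main(String[] args) {
        int[] sizes = {1_000, 10_000, 100_000, 1_000_000};
        Random rng = new Random(42);                  // fixed seed for repeatable inputs
        for (int n : sizes) {
            int[] data = rng.ints(n).toArray();       // n random integers
            long start = System.nanoTime();
            java.util.Arrays.sort(data);              // the algorithm under test
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("n = " + n + ": " + elapsedMs + " ms");
        }
    }
}
```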
Example of Empirical Testing
• There are a number of methods to sort an array. The following table lists the time in milliseconds that each sorting method took on an array of randomly generated integers, taking the median over three separate runs. (Numbers in italics are estimates.)

Experimental test data on a 400 MHz Pentium II Red Hat Linux machine; columns give the array size n, entries are times in ms (seconds/1000). Data: K. Hunt.

| Sort      | 100 | 1000 | 2000 | 4000 | 10,000 | 100,000            | 1,000,000            | 10,000,000              |
|-----------|-----|------|------|------|--------|--------------------|----------------------|-------------------------|
| Bubble    | 1   | 131  | 439  | 1954 | 7612   | 760000 (12.67 min) | 76000000 (21.11 hrs) | 7600000000 (87.96 days) |
| Selection | 1   | 35   | 179  | 588  | 4280   | 428000 (7.13 min)  | 42800000 (11.89 hrs) | 4280000000 (49.54 days) |
| Insertion | 1   | 30   | 98   | 336  | 2425   | 242500 (4.04 min)  | 24250000 (6.74 hrs)  | 2425000000 (28.07 days) |
| Merge     | 3   | 5    | 8    | 15   | 47     | 521                | 5580                 | 60000                   |
| Quick     | 1   | 3    | 4    | 7    | 17     | 189                | 2290                 | 27006                   |

Asymptotic Complexity
• Complexity refers to the rate at which time or memory requirements grow (as a function of problem size)
  - The absolute, actual growth depends on the machine used to execute the language, the compiler, the hardware, the type of memory, and many other factors
  - Better to have a way of describing the inherent complexity of a program independent of the platform used
• Requires a "proportionality" approach: express complexity in terms of its relationship to some known function
• This type of analysis is known as asymptotic analysis
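A quick sanity check connecting the table to the growth-rate idea above (this arithmetic is my own, using only the numbers in the table): when n grows by a factor of 10, bubble sort's time grows by roughly a factor of 100 (7612 ms at n = 10,000 versus an estimated 760,000 ms at n = 100,000), the signature of quadratic growth, while merge sort grows by only slightly more than a factor of 10 over the same step (47 ms to 521 ms), consistent with roughly n log n behavior.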
Simplifying Asymptotic Analysis
• Each CPU operation takes a discrete amount of time:
  - Arithmetic operations (+, -, /, *, %)
  - Logical operations (!, &&, ||)
  - Comparisons (<, ==, >, <=, >=, !=)
  - Assignment (=)
  - Method calls and method returns
  - Array access ...
• Operations can actually run at different speeds (depending on the CPU and operating system), but we assume (for simplicity) that each operation takes the same amount of time to complete
• The time an algorithm takes on input of size N is then given as T(N):
    T(N) = |Ops| × Time_Op
  where |Ops| is the number of operations performed and Time_Op is the (uniform) time per operation
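As an illustration of this counting model (an added example; the exact counts depend on what one chooses to call a single operation), consider summing an array of N elements:

```java
public class OpCount {
    // Operation-counting sketch for the model T(N) = |Ops| * Time_Op.
    // Counts are approximate; they depend on what you call "one operation".
    static int sum(int[] a) {
        int total = 0;                       // 1 assignment
        for (int i = 0; i < a.length; i++) { // 1 assignment; per pass: 1 comparison + 1 increment
            total += a[i];                   // per pass: 1 array access + 1 add + 1 assignment
        }
        return total;                        // 1 return
    }
    // Roughly |Ops| = 5N + c for a small constant c, where N = a.length,
    // so T(N) = (5N + c) * Time_Op -- linear in N.

    public static void main(String[] args) {
        System.out.println(sum(new int[]{1, 2, 3, 4})); // 10
    }
}
```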
Asymptotic Notation
• Asymptotic notation is used to compare the growth rates of two different functions:
  - T(N) = O(f(N)) when T(N) ≤ c · f(N)
  - T(N) = Ω(g(N)) when g(N) ≤ c · T(N)
  - T(N) = Θ(h(N)) when T(N) = O(h(N)) and T(N) = Ω(h(N))
Asymptotic Notation
• To be exact, the notation means there exists some constant bound on performance, and that the given relationship holds once the input size gets to be large enough
• Big-O: function T(N) is "no greater than" function g(n) if, once the input size is greater than or equal to some point x, T(N) is no greater than g(n) multiplied by some constant c:
    T(N) = O(g(n)) ≡ ∃c, ∃x, ∀y ≥ x : T(y) ≤ c · g(y)
• Big-Omega: T(N) is "no less than" g(n), when the reverse is true:
    T(N) = Ω(g(n)) ≡ ∃c, ∃x, ∀y ≥ x : g(y) ≤ c · T(y)
• Big-Theta: T(N) is "the same as" g(n), when both are true:
    T(N) = Θ(g(n)) ≡ T(n) = O(g(n)) and T(n) = Ω(g(n))
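A concrete instance of the Big-O definition (an added example, not from the slides): take T(N) = 3N + 5 and g(N) = N. Choosing c = 4 and x = 5 works, because for every y ≥ 5 we have 3y + 5 ≤ 3y + y = 4y = c · g(y); hence T(N) = O(N).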
Big O Example: f(x) = 7x + 600 vs. g(x) = 18x
• As we can easily see from the graph, when we reach the point x ≥ 55,
    7x + 600 ≤ 18x
• And so:
    f(x) ≤ 1 × g(x)
    f(x) = O(g(x))
• Similarly, since 18x < 21x < 21x + 1800 = 3 × (7x + 600), we have:
    g(x) ≤ 3 × f(x)
    g(x) = O(f(x))
• This means that the two functions are effectively "equal":
    f(x) = Θ(g(x))
• While the functions are obviously not actually equal, the Big-Theta relationship means that they have the same order of complexity: they are "in the same ball-park" for the actual run-time.
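Without the graph, the threshold can be checked directly (my own algebra, added here): 7x + 600 ≤ 18x is equivalent to 600 ≤ 11x, i.e. x ≥ 600/11 ≈ 54.5, which matches the slide's cut-off of x ≥ 55 for whole-number x.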
Big O Example: g(x) = 18x vs. f(x) = (1/5)x²
• Here, the graph again shows that when we reach the point x ≥ 90,
    18x ≤ (1/5)x²
• And so:
    g(x) ≤ 1 × f(x)
    g(x) = O(f(x))
• For the other direction, it is more complicated: while the graph shows that f(x) eventually grows larger than g(x) once x is big enough, we need to see if there is some constant c such that:
    f(x) ≤ c × g(x)
• It turns out that no such fixed, finite constant can possibly exist, and so we can say that f(x) is "larger than" g(x):
    f(x) = Ω(g(x))
    f(x) ≠ O(g(x))
• This is different from the previous example: the two functions are not in the same basic ball-park. Here, the function f(x) is actually more complex than the function g(x).
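Spelling out why no constant can work (an added step, not on the slide): if (1/5)x² ≤ c · 18x held for all sufficiently large x, then dividing both sides by x would give x ≤ 90c for all large x, which no fixed, finite c can satisfy; hence f(x) ≠ O(g(x)), even though f(x) = Ω(g(x)).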
Orders of Common Functions

| Notation   | Name        |
|------------|-------------|
| O(1)       | constant    |
| O(log n)   | logarithmic |
| O(n)       | linear      |
| O(n log n) | log-linear  |
| O(n²)      | quadratic   |
| O(n^c)     | polynomial  |
| O(c^n)     | exponential |
| O(n!)      | factorial   |
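To connect a few of these orders to code shapes (an illustrative sketch of my own, in Java since the deck does not name a language; these are typical patterns, not the only ways each order can arise):

```java
// Typical code shapes for a few of the common orders above.
public class CommonOrders {
    // O(1): constant -- one array access, independent of n
    static int first(int[] a) { return a[0]; }

    // O(n): linear -- one pass over the input
    static long total(int[] a) {
        long s = 0;
        for (int x : a) s += x;
        return s;
    }

    // O(n^2): quadratic -- nested passes over the input
    static int countPairsSummingToZero(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] + a[j] == 0) count++;
        return count;
    }

    // O(log n): logarithmic -- halve the search range each step (array must be sorted)
    static int binarySearch(int[] sorted, int key) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;
            if (sorted[mid] == key) return mid;
            if (sorted[mid] < key) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] a = {-3, -1, 0, 1, 2, 3};
        System.out.println(first(a));                    // -3
        System.out.println(total(a));                    // 2
        System.out.println(countPairsSummingToZero(a));  // 2  (-3+3 and -1+1)
        System.out.println(binarySearch(a, 2));          // 4
    }
}
```

Counting operations as on the earlier slide, the single loop does a constant amount of work per element (linear), the nested loops do work for roughly n²/2 pairs (quadratic), and binary search halves the remaining range on every pass (logarithmic).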
This Week
• Topic: More on algorithmic analysis
• Meet: regular classroom this week (no lab yet)
• Read: Text, ch. 02
• Homework 01: posted to D2L, due Wednesday, 10 Feb.
• Office Hours: Wing 210
  - Tuesday & Thursday: 10:00–11:30 AM
  - Tuesday & Thursday: 4:00–5:30 PM