Analysing Algorithms (time complexity)
Arash Rafiey
August 30, 2016
We are usually interested in running time (but sometimes also in memory requirements).
Example: One of the simplest sorting algorithms
Input: n numbers in an array A[1], …, A[n]
1. for (i = 1; i < n; i++) {
2.     element = A[i]; index = i;
3.     for (j = i + 1; j <= n; j++)
4.         if (A[j] < element) {
5.             element = A[j];
6.             index = j;
           }
7.     SWAP(A[i], A[index]);
8. }
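For readers who want to run the algorithm, here is one possible C rendering of the pseudocode above (a sketch, not from the slides): it uses C's 0-based arrays, so every index shifts down by one, and SWAP is written out as an explicit exchange.

    #include <stdio.h>

    /* Selection sort, following the pseudocode above with C's 0-based
       indexing: position i receives the minimum of A[i..n-1]. */
    void selection_sort(int A[], int n) {
        for (int i = 0; i < n - 1; i++) {
            int element = A[i];              /* current minimum value    */
            int index = i;                   /* position of that minimum */
            for (int j = i + 1; j < n; j++)
                if (A[j] < element) {
                    element = A[j];
                    index = j;
                }
            int tmp = A[i];                  /* SWAP(A[i], A[index]) */
            A[i] = A[index];
            A[index] = tmp;
        }
    }

    int main(void) {
        int A[] = {14, 13, 12, 15, 16, 11}; /* the example input below */
        int n = sizeof A / sizeof A[0];
        selection_sort(A, n);
        for (int i = 0; i < n; i++)
            printf("%d ", A[i]);             /* prints: 11 12 13 14 15 16 */
        printf("\n");
        return 0;
    }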
Example: n = 6, A = [14, 13, 12, 15, 16, 11]

i   element   index   "new" A[]
1      11       6     [11, 13, 12, 15, 16, 14]
2      12       3     [11, 12, 13, 15, 16, 14]
3      13       3     [11, 12, 13, 15, 16, 14]
4      14       6     [11, 12, 13, 14, 16, 15]
5      15       6     [11, 12, 13, 14, 15, 16]
Simple sort example

The running time clearly depends on n (both loops do):
– The body of the outer loop (over i) is executed n − 1 times.
– Each increment of i takes constant time, c1.
– Line 2 takes constant time, c2.
– The body of the inner loop (over j) is executed n − i times.
– Again, each increment takes time c1.
– Suppose the comparison in line 4 takes constant time, c3.
– If the condition is true, lines 5–6 cost another c2; otherwise nothing.
– Thus lines 4–6 take at most c3 + c2.
– The swap in line 7 takes constant time, c4.
Putting everything together (the worst case):

    Σ_{i=1}^{n−1} ( c1 + c2 + Σ_{j=i+1}^{n} (c1 + c3 + c2) + c4 )

Let's simplify this expression a bit:
Let d1 = c1 + c2 + c4
Let d2 = c1 + c2 + c3
Note: d1 and d2 are constants.

Now we have

    Σ_{i=1}^{n−1} ( d1 + Σ_{j=i+1}^{n} d2 )

Note that Σ_{j=i+1}^{n} d2 = (n − i) · d2 = n · d2 − i · d2.
So the total is

    Σ_{i=1}^{n−1} (d1 + n · d2 − i · d2)

The terms d1 and n · d2 do not depend on i, so this equals

    (n − 1) · d1 + (n − 1) · n · d2 − d2 · Σ_{i=1}^{n−1} i

We know that

    Σ_{i=1}^{k} i = 1 + 2 + 3 + · · · + k = k · (k + 1) / 2
Thus the expression becomes

    (n − 1) · d1 + (n − 1) · n · d2 − d2 · (n − 1) · n / 2
    = (n − 1) · d1 + d2 · (n − 1) · n / 2
    = n · d1 − d1 + n² · d2 / 2 − n · d2 / 2
    = n² · d2 / 2 + n · (d1 − d2 / 2) − d1

With e1 = d2/2 and e2 = d1 − d2/2 (note: e1 and e2 are constants) we obtain

    e1 · n² + e2 · n − d1

Since e1, e2, and d1 are constants, the running time depends quadratically on n.

This is the idea behind asymptotic analysis: we don't care about constants (multiplicative or additive) or about lower-order terms.
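As a sanity check (an addition, not part of the original analysis), we can count how often the dominant operation, the comparison in line 4, executes. The count is exactly (n − 1) + (n − 2) + · · · + 1 = n(n − 1)/2 for every input, matching the quadratic bound derived above.

    #include <stdio.h>

    /* Count how many times the comparison in line 4 executes for a
       given n; the actual input values are irrelevant to this count. */
    long comparisons(int n) {
        long count = 0;
        for (int i = 1; i < n; i++)
            for (int j = i + 1; j <= n; j++)
                count++;                     /* one execution of line 4 */
        return count;
    }

    int main(void) {
        for (int n = 10; n <= 10000; n *= 10)
            printf("n = %5d   counted = %10ld   n(n-1)/2 = %10ld\n",
                   n, comparisons(n), (long)n * (n - 1) / 2);
        return 0;
    }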
Theta-notation

For a given function g(n), Θ(g(n)) denotes the set

    Θ(g(n)) = { f(n) : there exist positive constants c1, c2, n0 such that
                c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all n ≥ n0 }

Intuition: f(n) belongs to the family Θ(g(n)) if there exist constants c1, c2 such that f(n) fits between c1 · g(n) and c2 · g(n) for all sufficiently large n.

Correct notation: f(n) ∈ Θ(g(n)). Usually used: f(n) = Θ(g(n)). We also say that "f(n) is in Θ(g(n))".
Examples of Θ-notation:

f(n) = 2n² = Θ(n²),
because with g(n) = n² and c1 = 1 and c2 = 2 we have
0 ≤ c1 · g(n) ≤ f(n) = 2 · n² ≤ c2 · g(n).
f(n) = 8n⁵ + 17n⁴ − 25 = Θ(n⁵),
because f(n) ≥ 7 · n⁵ for n large enough:

    n    8n⁵ + 17n⁴ − 25               7n⁵
    1    8 · 1 + 17 · 1 − 25 = 0         7
    2    8 · 32 + 17 · 16 − 25 = 503   224

and f(n) ≤ 8n⁵ + 17n⁵ = 25n⁵; thus c1 = 7, c2 = 25, and n0 = 2 are good enough.
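A quick numeric check of these constants (an addition, not in the slides): for every n ≥ 2, f(n) should lie between 7n⁵ and 25n⁵.

    #include <stdio.h>

    /* Verify 7*n^5 <= 8n^5 + 17n^4 - 25 <= 25*n^5 for small n >= 2. */
    int main(void) {
        for (long n = 2; n <= 20; n++) {
            long n4 = n * n * n * n, n5 = n4 * n;
            long f = 8 * n5 + 17 * n4 - 25;
            printf("n = %2ld   7n^5 = %10ld   f(n) = %10ld   25n^5 = %10ld   %s\n",
                   n, 7 * n5, f, 25 * n5,
                   (7 * n5 <= f && f <= 25 * n5) ? "ok" : "VIOLATED");
        }
        return 0;
    }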
More intuition: for all n ≥ n0, the function f(n) is equal to g(n) to within a constant factor. We say that g(n) is an asymptotically tight bound for f(n).

Back to the sorting example

We had running time T(n) = e1 · n² + e2 · n − d1. Now we can say T(n) = Θ(n²): we have formal means to "get rid" of lower-order terms and constant coefficients.
Why is this true?

We must find positive c1, c2, n0 such that

    c1 · n² ≤ e1 · n² + e2 · n − d1 ≤ c2 · n²

for all n ≥ n0. Dividing by n² gives

    c1 ≤ e1 + e2/n − d1/n² ≤ c2

Suppose e1, e2, d1 are positive (the other cases are similar).
For n ≥ e2 we have e2/n ≤ 1, so e1 + e2/n − d1/n² ≤ e1 + 1; hence c2 = e1 + 1 and n0 = e2 do the job for the right-hand inequality.
Also, for n² ≥ d1, i.e. n ≥ √d1, we have d1/n² ≤ 1, so e1 + e2/n − d1/n² ≥ e1 − 1; therefore c1 = e1 − 1 suffices (assuming e1 > 1; if e1 ≤ 1, take c1 = e1/2 and a correspondingly larger n0).
With n0 = max{e2, √d1} both conditions hold simultaneously.
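To see the recipe in action, here is a small numeric check. The constants e1 = 3, e2 = 5, d1 = 4 are arbitrary illustrative values chosen for this sketch, not derived from any machine model.

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double e1 = 3.0, e2 = 5.0, d1 = 4.0;     /* illustrative only */
        double c1 = e1 - 1.0, c2 = e1 + 1.0;
        double n0 = fmax(e2, sqrt(d1));           /* n0 = max{e2, sqrt(d1)} */
        for (double n = n0; n <= n0 + 500; n += 100) {
            double T = e1 * n * n + e2 * n - d1;
            printf("n = %4.0f   c1*n^2 = %10.0f   T(n) = %10.0f   c2*n^2 = %10.0f\n",
                   n, c1 * n * n, T, c2 * n * n);
        }
        return 0;
    }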
Theta notation respects the main term

However, n³ ≠ Θ(n²).

Recall: for n³ = Θ(n²) we would have to find constants c1, c2, n0 with

    0 ≤ c1 · n² ≤ n³ ≤ c2 · n²

for all n ≥ n0.

Intuition: there is a factor of n between the two functions, so we cannot find a constant c2.

Suppose, for the purpose of contradiction, that there are constants c2 and n0 with n³ ≤ c2 · n² for all n ≥ n0. Dividing by n² yields n ≤ c2, which cannot possibly hold for arbitrarily large n (c2 must be a constant).
Big-O-notation

When we are interested in asymptotic upper bounds only, we use O-notation (read: "big-O").

For a given function g(n), define O(g(n)) (read: "big-O of g of n", or also "order g of n") as follows:

    O(g(n)) = { f(n) : there exist positive constants c, n0 such that
                f(n) ≤ c · g(n) for all n ≥ n0 }

We write f(n) = O(g(n)) to indicate that f(n) is a member of the set O(g(n)).

Obviously, f(n) = Θ(g(n)) implies f(n) = O(g(n)): we just drop the left inequality in the definition of Θ(g(n)).
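For example, 2n = O(n²): take c = 1 and n0 = 2, since 2n ≤ n² for all n ≥ 2. Note that 2n ≠ Θ(n²), so an O-bound need not be tight.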
Big-Omega-notation

Like O-notation, but for lower bounds.

For a given function g(n), Ω(g(n)) denotes the set

    Ω(g(n)) = { f(n) : there exist positive constants c, n0 such that
                c · g(n) ≤ f(n) for all n ≥ n0 }

Saying T(n) = Ω(n²) means the growth of T(n) is at least that of n².

Clearly, f(n) = Θ(g(n)) iff f(n) = Ω(g(n)) and f(n) = O(g(n)).
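For example, n³ = Ω(n²) with c = 1 and n0 = 1. Here, too, the bound need not be tight: n³ ≠ O(n²).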
o-notation

Similar to O:
f(n) = O(g(n)) means we can upper-bound the growth of f by the growth of g (up to a constant factor).
f(n) = o(g(n)) is the same, except we require the growth of f to be strictly smaller than the growth of g:

For a given function g(n), o(g(n)) denotes the set

    o(g(n)) = { f(n) : for any positive constant c there exists a positive constant n0
                such that c · f(n) < g(n) for all n ≥ n0 }
Intuition: f(n) becomes insignificant relative to g(n) as n approaches infinity:

    lim_{n→∞} f(n) / g(n) = 0

In other words, f is o(g) if no constant factor separates f from g: eventually f falls below g by more than any constant factor.

Examples:
n = o(n²)
log n = o(n)
n = o(2^n)
n^1000 = o(1.0001^n)
1 = o(log n)
Example

Show that n = o(n²).

We need to show that for every c > 0 there exists an n0 such that for every n ≥ n0 we have c · n < n². This means c < n, so setting n0 = ⌈c⌉ + 1 gives c · n < n² for every n ≥ n0.

Show that log n = o(n).

We need to show that for every c > 0 there exists an n0 such that for every n ≥ n0 we have c · log n < n. Substitute n = 2^m; it suffices to find an m0 such that c · m < 2^m for every m ≥ m0. Note that 2^m ≥ m² for every integer m ≥ 4. Hence for m ≥ m0 = max{4, ⌈c⌉ + 1} we have 2^m ≥ m² = m · m > c · m. Setting n0 = 2^{m0} does the job.
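The limit intuition is easy to observe numerically. The following sketch (an addition, not from the slides) prints f(n)/g(n) for the two pairs just proved; both ratios shrink toward 0 as n grows.

    #include <stdio.h>
    #include <math.h>

    /* f(n)/g(n) should tend to 0 whenever f(n) = o(g(n)). */
    int main(void) {
        printf("%10s %14s %14s\n", "n", "n/n^2", "log(n)/n");
        for (double n = 10; n <= 1e7; n *= 10)
            printf("%10.0f %14.8f %14.8f\n", n, n / (n * n), log(n) / n);
        return 0;
    }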
omega-notation

ω is to Ω what o is to O:

f(n) = ω(g(n)) iff g(n) = o(f(n)).

For a given function g(n), ω(g(n)) denotes the set

    ω(g(n)) = { f(n) : for any positive constant c there exists a positive constant n0
                such that c · g(n) < f(n) for all n ≥ n0 }

In other words,

    lim_{n→∞} f(n) / g(n) = ∞

if the limit exists. I.e., f(n) becomes arbitrarily large relative to g(n).
Exercises

1) Arrange the following functions in ascending order of growth:
f1(n) = n^2.5
f2(n) = √(2n)
f3(n) = n + 10
f4(n) = 10^n
f5(n) = 100^n
f6(n) = n² log n
Solution, in ascending order:
f2(n) = √(2n)
f3(n) = n + 10
f6(n) = n² log n
f1(n) = n^2.5
f4(n) = 10^n
f5(n) = 100^n
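One way to build intuition for this ordering (a sketch added here, not part of the exercise) is to evaluate all six functions at a single moderately large point, say n = 100, where even 100^n = 10^200 still fits in a double. A single data point is only suggestive, of course; the ordering is a statement about asymptotic growth.

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double n = 100.0;
        /* Printed in the claimed ascending order of growth;
           the base of the logarithm does not affect the growth class. */
        printf("f2 = sqrt(2n)  = %.3e\n", sqrt(2 * n));
        printf("f3 = n + 10    = %.3e\n", n + 10);
        printf("f6 = n^2 log n = %.3e\n", n * n * log2(n));
        printf("f1 = n^2.5     = %.3e\n", pow(n, 2.5));
        printf("f4 = 10^n      = %.3e\n", pow(10.0, n));
        printf("f5 = 100^n     = %.3e\n", pow(100.0, n));
        return 0;
    }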
2) Arrange the following functions in ascending order of growth:
g1(n) = 2^√(log n)
g2(n) = 2^n
g3(n) = n^(4/3)
g4(n) = n (log n)³
g5(n) = n^(log n)
g6(n) = 2^(2^n)
g7(n) = 2^(n²)