COL106: Data Structures and Algorithms (IIT Delhi, Semester-II-2016-17)
Homework-01
1. Answer the following:

   (a) State true or false: $2^{\sqrt{\log_2 n}}$ is $O(n)$.

       (a) True

   (b) Give reason for your answer to part (a).

       Solution: Since $n$ can be written as $2^{\log_2 n}$, and $\sqrt{\log_2 n} \le \log_2 n$ for every integer $n \ge 1$, we have that $2^{\sqrt{\log_2 n}} \le 1 \cdot n$ for all $n \ge 1$. So, for $c = 1$ and $n_0 = 1$, we have $\forall n \ge n_0$, $2^{\sqrt{\log_2 n}} \le c \cdot n$. So, from the definition of big-O, we get that $2^{\sqrt{\log_2 n}}$ is $O(n)$.
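   As a quick sanity check (not part of the original solution), the bound can be verified numerically; the following minimal Python sketch compares $2^{\sqrt{\log_2 n}}$ against $n$ for a few sample values:

       import math

       # 2^sqrt(log2 n) never exceeds n for these samples, consistent with c = 1, n0 = 1.
       for n in [2, 16, 1024, 10**6, 10**9]:
           value = 2 ** math.sqrt(math.log2(n))
           print(n, round(value, 2), value <= n)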
2. Answer the following:

   (a) State true or false: $3^n$ is $O(2^n)$.

       (a) False

   (b) Give reason for your answer to part (a).

       Solution: For functions $f(n)$ and $g(n)$, $f(n)$ is not $O(g(n))$ iff for all constants $c > 0$, $n_0 \ge 0$, there exists $n \ge n_0$ such that $f(n) > c \cdot g(n)$. So, in order to argue that $3^n$ is not $O(2^n)$, we need to show that for any constants $c > 0$, $n_0 \ge 0$, there exists $n \ge n_0$ such that $3^n > c \cdot 2^n$. Note that $3^n > c \cdot 2^n \Leftrightarrow (3/2)^n > c \Leftrightarrow n > \log_{3/2} c$. So, for any constant $c$, $3^n > c \cdot 2^n$ when $n > \log_{3/2} c$. This implies that $3^{n'} > c \cdot 2^{n'}$ for $n' = \max(c, n_0)$, using the fact that $c > \log_{3/2} c$ for every $c > 0$. Note that $n' \ge n_0$. So, from the definition of big-O, we get that $3^n$ is not $O(2^n)$.
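   The threshold $\log_{3/2} c$ used above can be spot-checked numerically (an illustrative Python sketch with constants of our own choosing, not part of the original solution):

       import math

       # For any constant c, the first integer past log_{3/2}(c) already gives 3^n > c * 2^n.
       for c in [10, 1000, 10**6]:
           threshold = math.log(c, 1.5)      # log base 3/2 of c
           n = math.floor(threshold) + 1     # smallest integer strictly above the threshold
           print(c, n, 3**n > c * 2**n)      # expected: True for every c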
3. Consider functions $f(n) = 10n2^n + 3^n$ and $g(n) = n3^n$. Answer the following:

   (a) State true or false: $f(n)$ is $O(g(n))$.

       (a) True

   (b) State true or false: $f(n)$ is $\Omega(g(n))$.

       (b) False

   (c) Give reason for your answer to part (b).

       Solution: $f(n)$ is $\Omega(g(n))$ iff there exist constants $c > 0$, $n_0 \ge 0$ such that for all $n \ge n_0$, $f(n) \ge c \cdot g(n)$. So, $f(n)$ is not $\Omega(g(n))$ iff for all constants $c > 0$, $n_0 \ge 0$, there exists $n \ge n_0$ such that $f(n) < c \cdot g(n)$. Note that $(c/2)n3^n > 3^n$ when $n > 2/c$, for any constant $c$. Moreover, note that $(c/2)n3^n > 10n2^n \Leftrightarrow (3/2)^n > 20/c \Leftrightarrow n > \log_{3/2}(20/c)$.

       Combining the previous two statements, we get that for any constant $c > 0$, $cn3^n > 10n2^n + 3^n$ when $n > \max(2/c, \log_{3/2}(20/c))$. This further implies that for any constants $c > 0$, $n_0 \ge 0$, $cn'3^{n'} > 10n'2^{n'} + 3^{n'}$ for $n' = \max(2/c, \log_{3/2}(20/c), n_0) + 1$. Note that $n' \ge n_0$. So, for any constants $c > 0$, $n_0 \ge 0$, there is a number $n \ge n_0$ ($n'$ above is such a number) such that $cn3^n > 10n2^n + 3^n$, i.e., $f(n) < c \cdot g(n)$. This implies that $f(n)$ is not $\Omega(g(n))$.
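   Equivalently, the ratio $f(n)/g(n)$ tends to $0$, which is another way to see that $f(n)$ cannot be $\Omega(g(n))$. A small numerical illustration (our own sketch, not part of the graded solution):

       # f(n) = 10*n*2^n + 3^n and g(n) = n*3^n: the ratio f(n)/g(n) shrinks towards 0,
       # so no positive constant c can satisfy f(n) >= c*g(n) for all large n.
       def f(n):
           return 10 * n * 2**n + 3**n

       def g(n):
           return n * 3**n

       for n in [5, 10, 20, 40, 80]:
           print(n, f(n) / g(n))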
4. Show using induction that for all $n \ge 0$, $1 + \frac{1}{2} + \frac{1}{2^2} + \frac{1}{2^3} + \cdots + \frac{1}{2^n} = \frac{1 - (1/2)^{n+1}}{1 - 1/2}$.

   Solution: Let $P(i)$ denote the statement that $1 + \frac{1}{2} + \frac{1}{2^2} + \cdots + \frac{1}{2^i} = \frac{1 - (1/2)^{i+1}}{1 - 1/2}$. We will use induction to show that $P(i)$ is true for all $i$.

   Basis step: For the base case we will argue that $P(0)$ is true. This is indeed the case since for $i = 0$ the summation only has one term $1$ and $\frac{1 - (1/2)^{0+1}}{1 - 1/2} = 1$.

   Inductive step: Assume that $P(0), P(1), \ldots, P(k)$ are true for arbitrary $k$. We will show that $P(k+1)$ is true. This follows from the following sequence of equalities:
   \[
   1 + \frac{1}{2} + \frac{1}{2^2} + \frac{1}{2^3} + \cdots + \frac{1}{2^k} + \frac{1}{2^{k+1}}
   = \left(1 + \frac{1}{2} + \frac{1}{2^2} + \frac{1}{2^3} + \cdots + \frac{1}{2^k}\right) + \frac{1}{2^{k+1}}
   \]
   \[
   = \frac{1 - (1/2)^{k+1}}{1 - 1/2} + \frac{1}{2^{k+1}}
   \qquad \text{(using the induction hypothesis that $P(k)$ is true)}
   \]
   \[
   = \frac{1 - (1/2)^{k+1}}{1 - 1/2} + \frac{(1/2)^{k+2}}{1 - 1/2}
   = \frac{1 - (1/2)^{k+2}}{1 - 1/2}
   \]
   This shows that $P(k+1)$ is true. So, from the principle of induction, we get that $P(n)$ holds for all $n \ge 0$.
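   The closed form can also be checked for small $n$ with exact rational arithmetic (an illustrative Python sketch, separate from the inductive proof above):

       from fractions import Fraction

       # Verify 1 + 1/2 + ... + 1/2^n == (1 - (1/2)^(n+1)) / (1 - 1/2) for n = 0..9.
       for n in range(10):
           lhs = sum(Fraction(1, 2**i) for i in range(n + 1))
           rhs = (1 - Fraction(1, 2) ** (n + 1)) / (1 - Fraction(1, 2))
           assert lhs == rhs
       print("closed form verified for n = 0..9")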
5. Consider the following recursive function:

   F(n)
   - If (n > 1) F(n/2)
   - Print(“Hello World”)

   Let $R(n)$ denote the number of times this function prints “Hello World” given the positive integer $n$ as input.

   (a) What is $R(n)$, in big-O notation, as a function of $n$?

       (a) $O(\log_2 n)$

   (b) Give reason for your answer to part (a).

       Solution: We will show that $R(n) = \lceil \log_2 n \rceil + 1$ and then conclude that $R(n) = O(\log_2 n)$. In order to argue that $R(n) = \lceil \log_2 n \rceil + 1$, we will argue that for all $n$ such that $2^k < n \le 2^{k+1}$, $R(n) = k + 2$, for all values of $k$ (treating $n/2$ in the function as exact division). This we show using induction.

       Let $P(i)$ denote the statement that for all $n$ such that $2^i < n \le 2^{i+1}$, $R(n) = i + 2$.

       Basis step: $P(0)$ is true since for all $n$ with $1 < n \le 2$, the function prints “Hello World” twice.

       Induction step: Assume that $P(0), P(1), \ldots, P(k)$ are true. We will argue that $P(k+1)$ is true. Consider any number $n$ with $2^{k+1} < n \le 2^{k+2}$. When the function F is called with such a number
       $n$ as input, it makes a recursive call to $F(n/2)$. So the total number of times “Hello World” is printed will be $R(n/2) + 1$. Note that $2^k < n/2 \le 2^{k+1}$. So, from the induction hypothesis, we get that $R(n/2) = k + 2$. So, $R(n) = R(n/2) + 1 = k + 3$. This shows that $P(k+1)$ is true.

       Using the principle of induction, we get that $P(i)$ is true for all $i$.
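   An iterative transcription of F that counts prints instead of printing can confirm this closed form for small inputs (a sketch under the same exact-division assumption as the proof; the helper name R is our own):

       def R(n):
           """Count the 'Hello World' prints of F on input n, with n/2 taken as exact division."""
           count = 1                 # the final call, with argument <= 1, still prints once
           while n > 1:
               n = n / 2             # halving is exact for these values, so the test n > 1 is reliable
               count += 1            # one print for each call made with argument > 1
           return count

       # For an integer n >= 1, (n - 1).bit_length() equals ceil(log2 n).
       for n in range(1, 1025):
           assert R(n) == (n - 1).bit_length() + 1
       print("R(n) = ceil(log2 n) + 1 holds for n = 1..1024")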
6. Consider the following recursive algorithm that is supposed to convert any positive integer in decimal to binary format. ⌊·⌋ denotes the floor function, n%2 denotes the remainder when n is divided by 2, and ∥ denotes concatenation.

   RecDecimalToBinary(n)
   - if (n = 0 or n = 1) return(n)
   - return(RecDecimalToBinary(⌊n/2⌋) ∥ n%2)

   Prove that the above algorithm is correct.

   Solution: We prove using induction on the input $n$. Let $P(i)$ denote the proposition that the algorithm outputs the correct binary representation of $i$.

   Basis step: $P(0)$ and $P(1)$ are true because the algorithm simply outputs $n$ when $n$ is 0 or 1, which is the correct binary representation.

   Inductive step: Assume that $P(0), \ldots, P(k)$ are true for an arbitrary $k$. Consider what happens when $k + 1 > 1$ is given as input to the algorithm. Let $v = \lfloor (k+1)/2 \rfloor$ and let $b_{r-1} b_{r-2} \ldots b_0$ be the binary representation of $v$. So, we have $v = \sum_{i=0}^{r-1} 2^i \cdot b_i$. From the induction hypothesis, we know that the program outputs $b_{r-1} b_{r-2} \ldots b_0 \,\|\, (k+1)\%2$. To show that $P(k+1)$ holds it will be sufficient to prove the following claim:

   Claim: $k + 1 = \sum_{i=0}^{r-1} 2^{i+1} \cdot b_i + (k+1)\%2$.

   Proof. The claim follows from the following sequence of equalities:
   \[
   \sum_{i=0}^{r-1} 2^{i+1} \cdot b_i + (k+1)\%2
   = 2 \cdot \sum_{i=0}^{r-1} 2^i \cdot b_i + (k+1)\%2
   = 2 \cdot v + (k+1)\%2
   = 2 \cdot \lfloor (k+1)/2 \rfloor + (k+1)\%2
   = k + 1.
   \]

   Using the principle of mathematical induction, we get that $P(i)$ holds for all $i$.
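   The algorithm transcribes directly into Python and can be compared against a reference conversion (an illustrative sketch; the function name is ours, and string concatenation plays the role of ∥):

       def rec_decimal_to_binary(n):
           """Transcription of RecDecimalToBinary: floor-divide, recurse, then append n % 2."""
           if n == 0 or n == 1:
               return str(n)
           return rec_decimal_to_binary(n // 2) + str(n % 2)

       # Compare against Python's built-in binary formatting over a range of inputs.
       for n in range(2000):
           assert rec_decimal_to_binary(n) == format(n, "b")
       print("matches format(n, 'b') for n = 0..1999")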
7. Show that:

   (a) If $d(n) = O(f(n))$ and $f(n) = O(g(n))$, then $d(n) = O(g(n))$.
       Solution: $d(n) = O(f(n))$ implies that there exist constants $c > 0$ and $n_0 \ge 1$ such that $\forall n \ge n_0$, $d(n) \le c \cdot f(n)$. Furthermore, $f(n) = O(g(n))$ implies that there exist constants $c' > 0$ and $n_0' \ge 1$ such that $\forall n \ge n_0'$, $f(n) \le c' \cdot g(n)$. Combining the previous two statements, we get that $\forall n \ge \max\{n_0, n_0'\}$, $d(n) \le (c \cdot c') \cdot g(n)$. This implies that for constants $c'' = c \cdot c'$ and $n_0'' = \max\{n_0, n_0'\}$, we have that $\forall n \ge n_0''$, $d(n) \le c'' \cdot g(n)$. This means $d(n) = O(g(n))$.
   (b) $\max\{f(n), g(n)\} = O(f(n) + g(n))$.
       Solution: $\max\{f(n), g(n)\} \le f(n) + g(n) = O(f(n) + g(n))$ (assuming, as usual for running-time functions, that $f(n)$ and $g(n)$ are non-negative). The last equality holds since for any function $h(n)$, $h(n) = O(h(n))$ is true from the definition of big-O (since $\forall n \ge 1$, $h(n) \le 1 \cdot h(n)$).
   (c) If $a(n) = O(f(n))$ and $b(n) = O(g(n))$, then $a(n) + b(n) = O(f(n) + g(n))$.
       Solution: $a(n) = O(f(n))$ implies that there exist constants $c > 0$ and $n_0 \ge 1$ such that $\forall n \ge n_0$, $a(n) \le c \cdot f(n)$. Furthermore, $b(n) = O(g(n))$ implies that there exist constants $c' > 0$ and $n_0' \ge 1$ such that $\forall n \ge n_0'$, $b(n) \le c' \cdot g(n)$. Combining the previous two statements, we get that $\forall n \ge \max\{n_0, n_0'\}$, $a(n) + b(n) \le \max\{c, c'\} \cdot (f(n) + g(n))$. This implies that for constants $c'' = \max\{c, c'\}$ and $n_0'' = \max\{n_0, n_0'\}$, we have that $\forall n \ge n_0''$, $a(n) + b(n) \le c'' \cdot (f(n) + g(n))$. This means $a(n) + b(n) = O(f(n) + g(n))$.
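       As a concrete illustration of part (c) (an example of our own, not part of the original solution): take $a(n) = 3n$, $f(n) = n$, $b(n) = n^2$, and $g(n) = n^2$. Then $a(n) \le 3 \cdot f(n)$ and $b(n) \le 1 \cdot g(n)$ for all $n \ge 1$, so with $c'' = \max\{3, 1\} = 3$ and $n_0'' = 1$ we indeed get $a(n) + b(n) = 3n + n^2 \le 3 \cdot (n + n^2) = c'' \cdot (f(n) + g(n))$ for all $n \ge 1$.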
8. Consider the two algorithms given below. In the input, A denotes an integer array and n denotes the
size of the array. Analyse the running time of these algorithms and express the running time in big-O
notation.
   Alg1(A, n)
   - for i = 1 to n              [≤ c·n]
   -   j ← i                     [≤ c·n]
   -   while (j < n)             [≤ c·Σ_{i=1}^{n} (n−i)/3]
   -     A[j] ← A[j] + 10        [≤ c·Σ_{i=1}^{n} (n−i)/3]
   -     j ← j + 3               [≤ c·Σ_{i=1}^{n} (n−i)/3]

   Alg2(A, n)
   - for i = 1 to n              [≤ c·n]
   -   for j = 2i to n           [≤ c·Σ_{i=1}^{⌊n/2⌋} (n−2i)]
   -     A[i] ← A[j] + 1         [≤ c·Σ_{i=1}^{⌊n/2⌋} (n−2i)]

   Solution: The upper bound on the worst-case number of operations for each line is written by its side in square brackets ($c$ is some constant). Adding everything up, we get that the worst-case running time of Alg1 is at most $2cn + 3c \sum_{i=1}^{n} (n-i)/3 = 2cn + c\left(n^2 - \frac{n(n+1)}{2}\right) = O(n^2)$. The worst-case running time of Alg2 is at most $cn + 2c \sum_{i=1}^{\lfloor n/2 \rfloor} (n - 2i) = O(n^2)$.
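   The line-by-line bounds can be spot-checked empirically by counting inner-loop executions, which dominate both running times (a rough Python sketch of our own, not part of the original solution):

       def alg1_inner_ops(n):
           """Count executions of the while-loop body in Alg1."""
           ops = 0
           for i in range(1, n + 1):
               j = i
               while j < n:
                   ops += 1          # stands in for A[j] <- A[j] + 10 and j <- j + 3
                   j += 3
           return ops

       def alg2_inner_ops(n):
           """Count executions of the inner assignment in Alg2."""
           ops = 0
           for i in range(1, n + 1):
               for j in range(2 * i, n + 1):
                   ops += 1          # stands in for A[i] <- A[j] + 1
           return ops

       # Both counts grow quadratically: ops / n^2 settles near a constant
       # (roughly 1/6 for Alg1 and 1/4 for Alg2).
       for n in [100, 200, 400, 800]:
           print(n, alg1_inner_ops(n) / n**2, alg2_inner_ops(n) / n**2)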